nLine is a technology company dedicated to improving electricity reliability in developing countries through innovative data collection and analysis. We develop and deploy advanced sensor technologies and analytics platforms to provide utilities, regulators, and policymakers with accurate, actionable insights into power grid performance. Our work spans multiple countries, particularly in Sub-Saharan Africa, where we collect and analyze granular data on power quality and outages. By leveraging cutting-edge hardware and software solutions, nLine enables data-driven decision-making for infrastructure investments and operational improvements.
nLine operates with a flat organizational structure and values independence, rigor, and versatility. Our work involves developing and maintaining scalable systems that transform raw sensor data into insights, ultimately contributing to better electricity access and reliability in underserved regions. Scientific integrity is therefore held in the highest regard: a successful candidate should never be comfortable shipping code that degrades the quality of the data nLine collects and presents to partners. Each person on the team takes ownership of their core technical areas while collaborating across our data processing and infrastructure projects. The ideal candidate can handle both high-level architectural decisions and hands-on development tasks, works autonomously, learns quickly, and delivers high-quality work with minimal supervision. They should also be comfortable in a small-team environment and able to adapt to a range of services and platforms.
Maintenance and co-development of core infra products
Manage and supervise cloud computing resources
Lead infrastructure planning and deployment using Terraform
Develop and maintain data processing and visualization systems
Contribute to backend development and integration of auxiliary services (e.g. survey data capture)
Troubleshooting and Improvement
Troubleshoot and resolve issues related to ongoing deployment projects (e.g. cloud provider resources, corrections to survey data)
Assist in the design and implementation of data storage solutions
Contribute to the ongoing improvement of the company's technical stack
Enhancing Analysis Data Pipeline
Implement and improve caching mechanisms for better performance
Architect and oversee the data analysis pipeline from ingestion to visualization
Education and Experience
Bachelor's degree in Computer Science, Engineering, or related field
3+ years of experience as a software, infrastructure, or devops engineer
Technical Proficiency
Experience with cloud computing platforms, preferably Google Cloud
Strong proficiency in Python
Experience with big data processing and distributed computing
Experience with data visualization and dashboard creation
Soft Skills
Experience leading small engineering/data teams (a plus)
Ability to work independently and collaboratively in a small team, remote environment
Passion for leveraging technology to solve real-world problems
Cloud and Infrastructure:
Deploying infrastructure with Infrastructure as Code practices (e.g. Terraform, Pulumi)
Deploying microservices on Kubernetes clusters (with Docker, Helm, etc)
Using at least one popular cloud provider (e.g. GCP, AWS)
Deploying, managing, and optimizing relational databases (e.g. PostgreSQL, TimescaleDB)
Storing tabular data in datalakes
Architecting and composing multiple microservices to support a cohesive product
Developing data backup and restore strategies and performing risk/cost tradeoffs
Implementing thoughtful security practices (e.g. storing and distributing secrets to microservices, appropriately managing granular resource access for team members)
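To illustrate the kind of risk/cost tradeoff reasoning the backup-strategy bullet above refers to, here is a minimal Python sketch of a grandfather-father-son retention policy. The retention windows and the scenario are illustrative assumptions, not nLine's actual policy:

```python
from datetime import date, timedelta

def backups_to_keep(backup_dates, today, daily=7, weekly=4, monthly=6):
    """Subset of backup dates retained under a grandfather-father-son
    policy: every backup from the last `daily` days, the newest backup
    per ISO week for `weekly` weeks, and the newest per calendar month
    for roughly `monthly` months."""
    keep = set()
    newest_per_week = {}
    newest_per_month = {}
    for d in backup_dates:
        age_days = (today - d).days
        if 0 <= age_days < daily:
            keep.add(d)  # keep all recent dailies
        wk = d.isocalendar()[:2]  # (ISO year, ISO week)
        if age_days < weekly * 7 and d > newest_per_week.get(wk, date.min):
            newest_per_week[wk] = d
        mo = (d.year, d.month)
        if age_days < monthly * 31 and d > newest_per_month.get(mo, date.min):
            newest_per_month[mo] = d
    keep.update(newest_per_week.values())
    keep.update(newest_per_month.values())
    return sorted(keep)

# Example: 60 consecutive daily backups; the policy retains far fewer,
# trading restore granularity for storage cost.
today = date(2024, 6, 30)
dates = [today - timedelta(days=i) for i in range(60)]
kept = backups_to_keep(dates, today)
```

The tradeoff is explicit in the parameters: widening any window improves restore granularity at a proportional storage cost.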
Programming and Data Processing
Python (Advanced, ideally including experience with parallel processing frameworks such as PySpark or Dask)
JavaScript (Intermediate, for maintaining existing JS microservices)
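To give a flavor of the parallel data-processing work, here is a minimal map/reduce sketch using Python's standard-library `concurrent.futures` as a stand-in for frameworks like PySpark or Dask. The per-chunk voltage aggregation is an invented example, not nLine's pipeline:

```python
from concurrent.futures import ThreadPoolExecutor

def summarize_chunk(readings):
    """Aggregate one chunk of (timestamp, voltage) sensor readings."""
    voltages = [v for _, v in readings]
    return {"n": len(voltages), "mean_v": sum(voltages) / len(voltages)}

def parallel_summary(readings, chunk_size=1000, workers=4):
    """Map summarize_chunk over fixed-size chunks in parallel, then
    reduce the per-chunk summaries into one overall weighted mean."""
    chunks = [readings[i:i + chunk_size]
              for i in range(0, len(readings), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(summarize_chunk, chunks))
    total_n = sum(p["n"] for p in parts)
    mean_v = sum(p["mean_v"] * p["n"] for p in parts) / total_n
    return {"n": total_n, "mean_v": mean_v}

# Example: 10,000 synthetic readings with a known mean of 230.0 V.
data = [(i, 230.0) for i in range(10_000)]
result = parallel_summary(data)
```

The same chunk-then-reduce shape is what distributed frameworks scale out across machines; only the executor changes.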
Data Visualization and Analytics
Experience displaying or visualizing data (e.g. in Grafana, Plotly)
IoT & Sensors
Receiving and storing data from remote devices
Optimizing data usage in protocol design
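On the protocol side, optimizing data usage often comes down to fixed binary encodings instead of verbose text formats. A hypothetical sketch using Python's `struct` module; the 8-byte field layout and scaling factors are invented for illustration, not nLine's actual wire format:

```python
import struct

# Hypothetical 8-byte record: uint32 epoch seconds, uint16 voltage in
# centivolts, uint16 frequency as millihertz offset from 40 Hz. One
# reading fits in 8 bytes versus ~40 bytes as JSON.
RECORD = struct.Struct("<IHH")

def encode_reading(epoch_s, voltage_v, freq_hz):
    """Pack one sensor reading into the compact binary record."""
    return RECORD.pack(epoch_s,
                       round(voltage_v * 100),
                       round((freq_hz - 40.0) * 1000))

def decode_reading(payload):
    """Unpack a binary record back into (epoch_s, volts, hertz)."""
    epoch_s, centivolts, mhz_offset = RECORD.unpack(payload)
    return epoch_s, centivolts / 100, 40.0 + mhz_offset / 1000

payload = encode_reading(1_700_000_000, 231.57, 50.002)  # 8 bytes
```

The scaling choices encode the real tradeoff: uint16 centivolts caps readings at 655.35 V, which is fine for distribution-voltage monitoring but would need revisiting for other ranges.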
General Engineering Tools
Git, Bash, Unix, etc.
Misc
Experience working remotely, independently, across timezones and with international teams
Familiarity with power grid operations and electricity reliability metrics
Nice-to-have experience
Experience with Databricks
Familiarity with Helm charts
Knowledge of low-level Python static analysis
Experience with Delta Lake and Parquet file formats
Experience with Plotly (or other visualization libraries)
Performance Optimization
Understanding of read and write optimization via chunking/partitioning for both datalake-based and relational datastores
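The read-optimization idea behind datalake partitioning is to encode common filter columns into the storage path so query engines can skip whole directories. A stdlib-only sketch of the widely used hive-style `col=value` layout; the column names are illustrative, not nLine's schema:

```python
from collections import defaultdict
from datetime import datetime

def partition_path(site_id, ts):
    """Hive-style partition key: engines filtering on site or date can
    prune entire directories instead of scanning every file."""
    return f"site={site_id}/date={ts:%Y-%m-%d}"

def bucket_rows(rows):
    """Group (site_id, timestamp, value) rows by partition so each
    partition can be written out as one well-sized chunk."""
    buckets = defaultdict(list)
    for site_id, ts, value in rows:
        buckets[partition_path(site_id, ts)].append((ts, value))
    return buckets

rows = [
    ("accra-01", datetime(2024, 6, 1, 8), 229.8),
    ("accra-01", datetime(2024, 6, 1, 9), 231.2),
    ("accra-02", datetime(2024, 6, 1, 8), 228.5),
]
buckets = bucket_rows(rows)
```

The same principle applies on the relational side, where table partitioning and chunked indexes (as in TimescaleDB hypertables) keep both reads and writes bounded to the relevant time ranges.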
Competitive salary, benchmarked against engineers with similar qualifications in the area where you live
Health, dental, and vision covered at 100% up to a certain premium threshold
Fully remote
Unlimited PTO
All federal holidays in the country in which you reside
Closed office between Christmas Eve and New Year’s Day
Subsidies/reimbursements for remote office setup
Subsidies for professional learning opportunities
Paid parental leave
Opportunities to travel to the regions in which we are deploying sensors
nLine is an equal opportunity and affirmative action employer. nLine will not discriminate on the basis of race, religion, national origin, sex, gender identity, sexual orientation, age, physical or mental disability, veteran status, or any other protected group. All employment is decided on the basis of qualifications, merit, and business need.
Applications will be accepted on a rolling basis.