Software Engineer, Perception

About the role

Rain integrates with early wildfire detection to dispatch autonomous aircraft to suspected ignitions. Once dispatched, Rain’s Wildfire Mission Autonomy System perceives the fire, shares intelligence, develops a suppression strategy, and when approved, completes the water drop and evaluates its efficacy. Combining wildfire mission management, path planning, fire perception, suppression strategy, and suppressant targeting, Rain’s technology gives fire agencies the ability to stop wildfires in their earliest stages, before they grow out of control. 

As a Perception Software Engineer at Rain, you’ll drive the advancement of our fire perception system deployed onboard autonomous aircraft. You will work closely with teammates to design, build, and refine technologies that identify, localize, model, and communicate critical information about wildfire environments. Collaborating with firefighters, pilots, and engineers, you will develop the advanced capabilities that give first responders a real-time operational picture of wildfire conditions.

What you’ll do

  • Apply machine learning and classical techniques to perform real-time mapping of wildfire environments for suppression and fire management onboard aircraft

  • Drive the design, development, testing, and deployment of the perception system, collaborating with engineering and field operations

  • Produce performant software for ingesting sensor data and for classifying, segmenting, tracking, localizing, and modeling the wildfire environment on resource-limited targets

  • Design experiments, data collection efforts, and curate training/evaluation sets

  • Architect, design, and implement core product applications as well as tools and infrastructure

  • Write performant, well-abstracted software, and improve code quality through design and code reviews

What you’ve done

  • 6+ years of experience in computer vision and machine learning, including segmentation, localization, and world modeling 

  • Strong proficiency in C++11 (or newer) development for resource-limited environments

  • Experience designing and developing algorithms for a variety of sensors such as visual and thermal cameras, RADAR, LiDAR, etc.

  • Strong mathematical skills and understanding of probabilistic techniques for uncertainty modeling

  • Ability to thrive in a fast-paced, collaborative, small-team environment with minimal supervision

  • Excellent analytical and communication skills, with demonstrated collaboration on interdisciplinary teams

What will make you a great fit

  • Experience with real-time perception systems for safety critical applications

  • Experience with depth estimation and tracking, geospatial modeling, and geospatial algorithms

  • Experience building and maintaining infrastructure for ML model development

  • Experience with GPU processing

  • Familiarity with sensor calibration, camera geometry, and Kalman filters

  • Familiarity with Linux development and target environments

  • An M.S. or Ph.D. in Robotics, Computer Science, Electrical Engineering, or a related field, or equivalent experience

Location

This is an on-site position. Our office is located in Alameda, California—just across the Bay from San Francisco. The office is easily accessible by ferry, bike, and scooter, and there is ample parking. Some travel for validation testing is expected.