Software Engineer, Perception

About the role

You are responsible for giving autonomous aircraft the ability to perceive wildfires—transforming raw sensor data into an accurate, evolving picture of the fire and its environment. By fusing vast quantities of real-time visual, thermal, and geospatial data into reliable intelligence, your systems ensure that first responders and their autonomous suppression aircraft act on truth, not guesswork. When your models decompose the fire front, track fire growth, and identify new spot fires, Rain-equipped aircraft gain the clarity needed to develop effective suppression strategies.

What we do

Rain integrates with early wildfire detection to dispatch autonomous aircraft to suspected ignitions. Once dispatched, Rain’s Wildfire Mission Autonomy System perceives the fire, shares intelligence, develops a suppression strategy, and when approved, completes the water drop and evaluates its efficacy. Combining wildfire mission management, path planning, fire perception, suppression strategy, and suppressant targeting, Rain’s technology gives fire agencies the ability to stop wildfires in their earliest stages, before they grow out of control.

What you will do

  • Apply machine learning and classical techniques to perform real-time mapping of wildfire environments for suppression and fire management onboard aircraft

  • Drive the design, development, testing, and deployment of the perception system, collaborating with engineering and field operations

  • Produce performant software for ingesting sensor data and for classifying, segmenting, tracking, localizing, and modeling the wildfire environment on resource-limited targets

  • Design experiments and data collection efforts, and curate training/evaluation sets

  • Architect, design, and implement core product applications as well as tools and infrastructure

  • Write performant, well-abstracted software and improve code quality through design and code reviews

What you have done

  • 6+ years of experience in computer vision and machine learning, including segmentation, localization, and world modeling

  • Strong proficiency in C++14 (or newer) development for resource-limited environments

  • Experience designing and developing algorithms for a variety of sensors, such as visual and thermal cameras, radar, and LiDAR

  • Strong mathematical skills and understanding of probabilistic techniques for uncertainty modeling

  • You thrive in a fast-paced, collaborative, small-team environment with minimal supervision

  • Excellent analytical and communication skills, and demonstrated collaboration with interdisciplinary teams

What will make you a great fit

  • Experience with real-time perception systems for safety-critical applications

  • Experience with depth estimation, tracking, and geospatial modeling

  • Experience building and maintaining infrastructure for ML model development

  • Experience with GPU processing

  • Familiarity with sensor calibration, camera geometry, and Kalman filters

  • Familiarity with Linux development and target environments

  • M.S. or Ph.D. in Robotics, Computer Science, Electrical Engineering, or a related field, or equivalent experience

  • You are willing to travel at least 15% of the year for validation testing

Location

This is an on-site position. Our office is located in Alameda, California—just across the Bay from San Francisco. The office is easily accessible by ferry, bike, and scooter, and there is ample parking.

Legal Notice

This role involves access to information governed by U.S. export control laws. To comply, applicants must qualify as a “U.S. Person” (U.S. Citizen, lawful permanent resident, refugee, or asylee). Employment offers are contingent upon meeting these requirements.