Job Description
Reality Labs at Meta is building products that make it easier for people around the world to come together and connect through world-class Augmented, Mixed, and Virtual Reality hardware and software. The Health & Safety team brings together a diverse, interdisciplinary group of research scientists to shape the future of Reality Labs products. The team cultivates an honest, collaborative environment where creative and motivated individuals thrive.
As a research prototyper supporting User Health & Safety, you’ll partner with UX research scientists, mechanical engineers, electrical engineers, and other brilliant minds to build cutting-edge prototypes for research. The role requires you to collaborate across technical disciplines to inform your work, to stay detail-oriented in thinking about both current and future team needs, and to iterate and problem-solve in a fast-paced environment. You will draw on a full range of technical and interpersonal skills to help our talented researchers drive product impact through innovative and rigorous research.
You’ll foster scientific explorations and generate viable paths to the consumer products that will connect people in meaningful ways for decades to come. There is immense potential for AR and VR to change the world – and we’re just getting started.
Job Responsibilities:
- Develop, debug, and document code using C/C++, shell scripting (e.g., bash), C#, Unity, Unreal Engine, and/or Python
- Support H&S UXR studies by developing testbeds and tools for diverse and innovative hardware prototypes. Anticipated needs include:
  - Implementing embedded device firmware to communicate with sensors, displays, dimming technologies, and other embedded hardware components and systems
  - Integrating hardware and software systems across sensors, displays, computer-based stimuli, dynamic obstacle courses, and more
  - Using API frameworks to log, transmit, and visualize multi-dimensional data from sensors and tracking systems
  - Developing event-triggered algorithms to generate dynamic and/or randomized stimuli (e.g., display content); a minimal illustrative sketch follows this list
- Partner with cross-functional (XFN) teams to understand capabilities and constraints of integrated hardware systems and firmware (including, but not limited to, displays, eye-tracking, cameras, depth sensors, IMU sensors, and wearables)
- Collaborate with H&S UXR team members to create flexible, reusable software user interfaces (“virtual labs”) that let study RAs adjust experimental parameters and that can accommodate new parameters in the future
- Work with H&S UXR research scientists and/or vendors to provide rapid, on-site response and troubleshooting during user research studies
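For illustration only, and not as part of the role requirements: the sketch below shows, in Python (one of the languages listed above), the general shape of an event-triggered, randomized stimulus generator with timestamped logging, of the kind referenced in the responsibilities list. All names in it (Stimulus, make_random_stimulus, run_session, session_log.csv) are hypothetical placeholders, not Meta tooling.

```python
# Illustrative sketch only: a toy event-triggered stimulus generator with
# timestamped logging. Names and parameters are hypothetical.
import csv
import random
import time
from dataclasses import dataclass

@dataclass
class Stimulus:
    kind: str         # e.g., "grating", "flash", "text"
    duration_s: float
    intensity: float  # normalized 0.0-1.0

def make_random_stimulus(rng: random.Random) -> Stimulus:
    """Draw a randomized stimulus from a small parameter space."""
    return Stimulus(
        kind=rng.choice(["grating", "flash", "text"]),
        duration_s=rng.uniform(0.2, 2.0),
        intensity=rng.uniform(0.1, 1.0),
    )

def run_session(n_trials: int, seed: int, log_path: str) -> None:
    """On each (simulated) trigger event, generate a randomized stimulus
    and append a timestamped record so the trial can be reconstructed."""
    rng = random.Random(seed)
    with open(log_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["trial", "t_unix", "kind", "duration_s", "intensity"])
        for trial in range(n_trials):
            stim = make_random_stimulus(rng)
            # In a real testbed the trigger would come from a sensor or
            # participant action; here it is simply simulated per trial.
            writer.writerow([trial, time.time(), stim.kind,
                             stim.duration_s, stim.intensity])

if __name__ == "__main__":
    run_session(n_trials=5, seed=42, log_path="session_log.csv")
```

Seeding the random generator keeps the stimulus sequence reproducible across sessions, which is one common way to make randomized study conditions comparable across participants.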
Minimum Qualifications:
- Bachelor’s Degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent practical experience
- 3 years’ experience with C/C++, shell scripting (e.g., bash), C#, Unity, Android Studio, and/or Python
- Experience integrating custom systems including sensors, API/SDK frameworks, and firmware
- Excellent communication skills for understanding and translating technical information to engineers, UXRs, designers, and executives
- Track record of operating independently, demonstrating creativity, being detail-oriented, solving ambiguous problems, and delivering results in a highly organized manner
- Willingness to occasionally travel to study sites
Preferred Qualifications:
- Experience facilitating implementation of closed-loop experimental paradigms, highly controlled presentation of visual content on external displays of varying types, and collection of system-user telemetry
- Experience working with electrical engineers, firmware developers, software engineers, and hardware prototypers
- Experience developing AR/VR applications
- Experience with Android Studio and/or Unity
- Experience with Python
This is the pay range that the employer reasonably expects to pay for this position: $37.87/hour to $52.43/hour.
Optional Benefits: Medical, Dental, Vision, 401(k)
The Meta CWX Program is enabled by TalentNet, a cutting-edge software platform that leads the contingent labor industry in technology innovation. The platform uses machine learning and artificial intelligence to match the right people with the right roles.
At Meta, we are constantly iterating, solving problems, and working together to connect people all over the world. That’s why it’s important that our workforce reflects the diversity of the people we serve. Hiring people with different backgrounds and points of view helps us make better decisions, build better products, and create better experiences for everyone.
We give people the power to build community and bring the world closer together. Our products empower more than 3 billion people around the world to share ideas, offer support, and make a difference.