Minimum qualifications:
- Bachelor's degree in Computer Science, Engineering, or equivalent practical experience.
- 2 years of experience in camera auto focus or Android mobile development.
- Experience with coding in C++ or Python.

Preferred qualifications:
- Master's degree or PhD in Computer Science, Image Processing, or a related technical field.
- Experience in camera software domains such as camera drivers, the hardware abstraction layer (HAL), and algorithms.
- Experience with machine learning and TensorFlow.
- Experience working with mobile chipset vendors.
About the job
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full-stack as we continue to push technology forward.
As a Camera 3A Software AF (Auto Focus) Engineer, you will develop auto focus algorithms and work with Software Engineers to integrate them into the Android platform and improve their performance. You will be responsible for launching the algorithms on mobile devices, working with Image Quality Engineers to fine-tune them for quality, and developing automated tuning and evaluation methodology to streamline the process and reduce manual effort.
In this role, you will also work with technologies related to auto focus functions (e.g., Time of Flight, Phase Detection). You will develop and simulate the in-house algorithm and integrate the complete solution into the Android platform.
Google's mission is to organize the world's information and make it universally accessible and useful. Our Devices & Services team combines the best of Google AI, Software, and Hardware to create radically helpful experiences for users. We research, design, and develop new technologies and hardware to make our users' interactions with computing faster, more seamless, and more powerful. Whether finding new ways to capture and sense the world around us, advancing form factors, or improving interaction methods, the Devices & Services team is making people's lives better through technology.
Responsibilities
- Develop PDAF (Phase Detection Auto Focus) algorithms and related infrastructure tools, such as simulators, debuggers, and evaluators.
- Engage with different sensor and module vendors and incorporate new technologies into Google products.
- Implement, improve, and integrate the algorithms onto device platforms.
- Design methodologies for automated tuning, testing, and simulation to reduce manual effort.