The Kuiper Production Operations team is looking for a Data Scientist who is motivated to take on challenges and be part of a growing team that analyzes and implements Kuiper controls. As part of the Kuiper Enterprise Technology team, you will be responsible for delivering Kuiper control systems and processes that meet Kuiper security and compliance standards. You will be a key liaison with Kuiper service teams, infrastructure teams, Kuiper Security, and Global Trade and Compliance. You will dive deep to understand, document, communicate, and implement controls for IT systems and processes, and drive innovative process changes and automation throughout the Kuiper organization.
We have a team culture that encourages ownership, diversity, inclusion, and innovation. You will have the opportunity to work across the entire Kuiper organization, implementing and managing various controls alongside Kuiper Security. Kuiper Security owns the policy and definition of controls; this role owns the coordination, development, implementation, and change management of controls, as well as managing defects and improvements.
Key job responsibilities
* Expertise in implementing machine learning algorithms and generative artificial intelligence to support Activity-Based Intelligence (ABI) methodologies for data monitoring, searching, research, and discovery.
* Proven experience with advanced statistical models, behavioral and population clustering, and predictive analytics applied to automation orchestration.
* Hands-on experience with reference model platforms such as QuickSight, Tableau, Databricks, AWS Glue Catalog, and Palantir.
* Proficiency with programming languages and tools such as Bourne shell, PowerShell, TypeScript, Go, Python, and Jupyter Notebook.
* Strong communication skills, with the ability to discuss and explain complex ML and AI concepts to executive leadership.
Export Control Requirement: Due to the need to access certain federal controlled information, you must be a U.S. citizen for consideration.
BASIC QUALIFICATIONS
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience mentoring team members on best practices
PREFERRED QUALIFICATIONS
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience operating large data warehouses
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. For individuals with disabilities who would like to request an accommodation, please visit https://www.amazon.jobs/en/disability/us.
Our compensation reflects the cost of labor across several US geographic markets. The base pay for this position ranges from $139,100/year in our lowest geographic market up to $240,500/year in our highest geographic market. Pay is based on a number of factors including market location and may vary depending on job-related knowledge, skills, and experience. Amazon is a total compensation company. Dependent on the position offered, equity, sign-on payments, and other forms of compensation may be provided as part of a total compensation package, in addition to a full range of medical, financial, and/or other benefits. For more information, please visit https://www.aboutamazon.com/workplace/employee-benefits. This position will remain posted until filled. Applicants should apply via our internal or external career site.