Senior Data Architect
Beehive Industries
100k - 165k USD/year
Office
Loveland, OH; Knoxville, TN; Englewood, CO
Full Time
Beehive Industries is dedicated to Powering American Defense by revolutionizing the design, development, and delivery of jet propulsion systems to support the warfighter. Through the integration of additive manufacturing, the company aims to meet the growing and urgent needs for unmanned aerial defense by dramatically improving a jet engine’s speed to market, fuel efficiency, and cost.
Founded in 2020, the company is headquartered in Englewood, Colorado, with additional facilities in Knoxville, Tennessee; Loveland, Ohio; and Mount Vernon, Ohio. Beehive is committed to growing and advancing the defense industrial base while manufacturing exclusively in the USA. This role can be hybrid, but you would need to live close to one of our facilities.
We are seeking a skilled Senior Data Architect to design, develop, and maintain data pipelines and integration solutions for enterprise systems, including our ERP (NetSuite), PLM (Teamcenter), a custom Manufacturing Execution System (MES), and integration platforms. The ideal candidate will have deep expertise in data engineering and system integration, along with hands-on development experience with Palantir’s Foundry platform. The Senior Data Architect will support our software developers, database architects, analysts, machine learning engineers, and data scientists on initiatives to ensure consistent architecture and delivery. This role requires a strategic thinker who can bridge the gap between business requirements and technical implementation, providing both leadership and execution for our Foundry buildout.
Responsibilities:
- Design and execute ETL/ELT pipelines: Lead the development of robust, scalable data pipelines using Foundry's native tools, including Pipeline Builder and Code Repositories. You will manage the entire data lifecycle, from initial ingestion to transformed, clean, and trusted datasets.
- Establish data connections: Configure and manage secure and reliable data connections to various internal and external sources. This includes databases, APIs, file systems, and other core applications using a variety of protocols.
- Architect Foundry solutions: Lead the design and architecture of end-to-end data solutions, including ontology modeling and operational workflows, ensuring alignment with our business goals.
- Define the Ontology: Co-design, develop, and maintain the Foundry Ontology, defining the core object types, link types, and action types that represent our business entities and their relationships.
- Drive data strategy: Partner with senior leadership and business stakeholders to define conceptual, logical, and physical data models.
- Establish governance: Develop and enforce foundational data governance standards, quality metrics, and security controls within the Foundry environment.
- Mentor and guide: Act as the primary subject matter expert, providing guidance and mentorship to other team members on data engineering and Foundry best practices as the team grows.
- Optimize performance: Ensure the scalability, reliability, and efficiency of all data pipelines and architectural components.
Requirements:
- Experience: Minimum 10 years of professional experience in data architecture, data engineering, or a related field, with significant hands-on experience with Palantir Foundry.
- Education: Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
- Palantir Foundry Core: Deep, demonstrable expertise with Palantir Foundry development and architecture, including:
- Data Connection: Hands-on experience configuring and managing connections to diverse data sources.
- Pipeline Builder: Proven ability to build and deploy complex, scalable ETL/ELT pipelines.
- Code Repositories: Proficiency in developing data transformations using Python/PySpark and Scala.
- Ontology Management: Expert-level knowledge of defining and maintaining the Foundry Ontology, including object types, link types, and action types.
- ETL/ELT and Data Warehousing: Extensive experience in data modeling (relational and graph), data warehousing, and the design of performant ETL/ELT processes.
- Database Connectivity: Hands-on experience connecting to, querying, and manipulating data from various sources using JDBC and ODBC drivers.
- SQL: Strong SQL skills for complex data querying, manipulation, and optimization.
- API Integration: Proven experience integrating data from and writing data back to applications via RESTful and SOAP APIs. This includes managing authentication and handling various request/response formats (JSON, XML).
- File Transfer Protocols: Experience with secure file transfer using protocols such as FTP and SFTP.
- Cloud Platforms: Hands-on experience with at least one major cloud platform (AWS, Azure, and/or Oracle Cloud Infrastructure), particularly with data storage and services relevant to Foundry deployments (e.g., S3, ADLS).
- Problem-Solving: Ability to troubleshoot complex data integration issues, debug pipelines, and design innovative solutions for complex business problems.
- Communication: Excellent communication, collaboration, and presentation skills to effectively work with both technical and non-technical stakeholders.
Preferred Qualifications:
- Certifications: Palantir Foundry certifications (e.g., Foundry Data Engineer, Foundry Application Developer).
- Enterprise Applications: Familiarity with large-scale corporate systems like ERP, PLM, MES, SCADA, etc.
- Advanced Analytics and AI: Familiarity with Foundry's AIP suite or other machine learning platforms.
- Web Development: Knowledge of web technologies like HTML, CSS, or JavaScript for building advanced interfaces with Slate.
- DevOps: Experience with DevOps practices, CI/CD pipelines, and tools like Git, Docker, or Kubernetes.
- Agile Delivery: Experience developing and delivering with Agile methodologies.
The ability to obtain and maintain a U.S. government-issued security clearance is required. U.S. Citizenship is required, as only U.S. Citizens are eligible for a security clearance.
If this sounds like you, please submit an application with your resume. This could be the opportunity you are looking for to expand your skills, contribute to a winning team, and work with talented people who love what they do and take pride in our mission.
In compliance with Colorado’s Equal Pay for Equal Work Act, the salary range for this position is $100,000-$165,000 base salary. Please note that wage information is a general guideline only, and we will consider factors such as (but not limited to) scope and responsibilities of the position, candidate’s work experience, education/training, key skills, and market conditions when extending an offer.
Beehive Industries offers a comprehensive benefits package that includes group Medical, Dental, Life, and Short- and Long-Term Disability coverage from day one. We also offer a generous 401(k) Retirement Savings Plan with a company match. Every role at Beehive Industries is bonus-eligible and also receives equity in the company.
Beehive Industries is committed to full compliance with applicable anti-discrimination laws. We are an equal opportunity employer and value diversity at our company. We strive to create an inclusive work environment and will not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
