Avenue Code is a leading software consultancy focused on delivering end-to-end development solutions for digital transformation across every vertical. We’re privately held, profitable, and have been on a solid growth trajectory since day one. We care deeply about our clients, our partners, and our people. We prefer the word ‘partner’ over ‘vendor’, and our investment in professional relationships is a reflection of that philosophy. We pride ourselves on our technical acumen, our collaborative problem-solving ability, and the warm professionalism of our teams.
About the Opportunity:
We are seeking an energetic and talented Data Science Engineer to deliver high-value, high-quality business capabilities on our data technology platform. With a background in both software development and machine learning, you will collaborate with software engineers, data scientists, and domain experts in supply chain and inventory management to develop data and machine learning products that support our company. You will be an integral member of an engineering team delivering across multiple business functional areas, and you will build data analysis infrastructure for effective prototyping and visualization of various data-driven approaches.
Key Responsibilities:
- Partner in building the infrastructure required for optimal extraction, transformation, visualization, and loading of data from a wide variety of data sources using SQL and big data technologies;
- Improve our Machine Learning systems by contributing to all phases of algorithm development including ideation, prototyping, design and production;
- Scale our data processing pipelines to handle complex processes efficiently and reliably, ensuring they are available, scalable, and fault tolerant;
- Build scalable production systems for data collection, data transformation, feature extraction, model training, and scoring, using distributed software tools;
- Work with the data science team to define data ingestion standards and assist with data-related technical issues;
- Partner with the product development team to understand various opportunities and use cases;
- Maintain specifications and metadata; create and maintain documentation;
- Build Communities-of-Practice in key data technologies.
Required Skills:
- Python experience;
- Experience with AWS or another cloud platform;
- SQL (Teradata/Redshift) experience with large datasets;
- Glue, Kafka, and Redshift (with a focus on infrastructure-as-code).

Additional Responsibilities:
- Create and maintain data ingestion pipelines;
- Collaborate daily with the product team, including pairing on all aspects of development;
- Create and maintain optimal data pipeline architecture;
- Assemble large, complex data sets that meet functional and non-functional business requirements;
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
Does this sound like you?
Apply now to become an Avenue Coder!