- Offer expired 23 days ago
- B2B contract
- senior specialist (Senior)
About the project
As a key member of Equinix's IT Data Science team, you will set standards in designing, building, scaling, and operating innovative end-to-end machine learning solutions, as well as in data collection, wrangling, and transformation for data coming from various source systems. You will own the end-to-end data pipeline for various projects and experiments. You will also be involved in the design, development, implementation, and maintenance of the Machine Learning solutions and data products built by our team.
Job Profile Summary
Work with members of the Analytics and Data Science team to design, develop, and deploy ML/AI solutions. You will not only architect and design your own projects but also influence the design of others. You will also concentrate on data discovery and research new use cases that bring value to the company. This is a hands-on, individual-contributor role.
Massive data: You will source, examine, and engineer data pipelines for gigabytes to terabytes of structured and unstructured data on our platform to create value for customers.
Production deployment: You will be responsible for integration and deployment of the machine learning pipelines into production where your ideas will come to life.
Linux hacking: You will use the command line masterfully, including editors like vi/emacs, and go beyond the basics of grep, bash, awk, sed, etc., to dive aggressively into data, systems, and compute platforms to get the results you are seeking.
Pushing the limits: This role sits on the cutting edge of our Data / Machine Learning platform. As we push to solve more of our ML/AI challenges, you will prototype new features, tools, and ideas, innovating at a fast pace to maintain our competitive edge.
Standardization: You will be fully involved in optimizing processes and standardizing approaches to solving use cases and problems.
Collaboration: Coordinate and work with cross-functional teams, sometimes located in different geographic locations.
CS fundamentals: You have earned at least a B.S. (M.S. / Ph.D. desired) in Computer Science or a related field, and you have a strong ethos of continuous learning.
Software engineering: You have 5+ years of professional software development experience using Python and SQL in production, with version control (Git) and good analytical and debugging skills. Experience with packages like pandas, NumPy, scikit-learn, and PySpark.
Machine Learning: You have 2+ years of experience with machine learning modeling, creating prototypes and evolving them into full-fledged products served in production environments. Understanding of core problem types such as classification, regression, time series, NLP, anomaly detection, and clustering.
Environments: You have worked in at least one cloud environment such as GCP (preferred), AWS, or another cloud data platform. Understanding of the cloud components used to build ML solutions (Seldon, Kubeflow, Vertex AI, or similar).
Data Modeling: Expertise in SQL is a must. You have a flair for data, schemas, and data models, know how to design data models for efficient querying and analysis, and understand and develop data validation techniques.
Project management: You demonstrate excellent project and time management skills, with exposure to Scrum or other agile practices in Jira.
Fluent in English
MLOps: Practical experience with MLOps use cases such as feature stores, model monitoring, data and model versioning, CI/CD, and GitHub Actions.
AutoML: Experience with, and interest in, AutoML software (such as H2O Driverless AI, GCP AutoML, or others).
Graph databases: Experience in evaluating, designing and implementing scalable solutions that leverage graph DBs (Neo4j, TigerGraph or similar).
Real-Time Systems: You understand the evolution of databases toward in-memory, NoSQL, and indexing technologies, along with real-time and stream-processing systems like Kafka, Google Pub/Sub, Dataflow, or similar solutions.
Equinix is among the fastest-growing data center companies, expanding connectivity between clients worldwide. That's why we're always looking for creative and forward-thinking individuals who can help us achieve our goal of global interconnection. With 200 data centers in over 24 countries spanning 5 continents, we are home to the Cloud, supporting over 1,000 Cloud and IT services companies directly engaged in technological innovation and development. We are passionate about further evolving specific areas of software development, software and network architecture, network operations, and complex cloud and application solutions.
Equinix makes the internet work faster, better, and more reliably. For this purpose, we hire talented individuals who thrive on solving hard problems. In return, we give them opportunities to hone new skills, experiment with new approaches, and grow in new directions. Our culture is at the heart of our success and it’s our authentic, humble and gritty people who create The Magic of Equinix. We all share a passion for winning and put the customer at the center of everything we do.
The successful candidate will
Be a talent multiplier who gets the team around them to excel
Be persistent, creative, and relentlessly driven to get results
Exhibit a strong backbone to challenge the status quo, when needed
Exhibit a high level of curiosity, keeping abreast of the latest trends & technologies
Show pride of ownership and strive for excellence in everything undertaken