NorthBay Solutions is an Advanced Consulting Partner for AWS. NorthBay was the first AWS Partner to achieve the “Big Data” competency from AWS, and the first AWS Partner to achieve the “ML/AI” competency from AWS. A significant portion of our projects are joint projects with AWS Professional Services. Both our Global CTO and our VP ML/AI previously worked at AWS.
This position involves helping enterprise clients build solutions at the intersection of these trends:
AWS Cloud Migration (Database, Data Warehouse, Applications & Infrastructure)
AWS Data Lakes/Enterprise Data Platforms
AWS Serverless Application Architecture and Development
AWS Machine Learning and AI
In this role, you will collaborate with NorthBay customers, sometimes working onsite, to:
Understand customer requirements and translate them into specifications
Develop AWS-based solutions, working with both onshore and offshore teams
Deliver and deploy solutions within customers’ enterprise architectures.
This includes assessing customer needs, re-engineering business intelligence processes, designing and developing data models, and sharing your expertise throughout the deployment process.
Are you currently working, or have you recently worked, with AWS or another major Cloud Provider?
· Ability to present business solutions to customer CTOs and CIOs, focusing on time-to-value for cloud solutions
· Interface with Client project sponsors to gather, assess and interpret client needs and requirements
· Attend on-site customer meetings as needed
· Assess use cases for various teams within the client company
· This includes working with Security and Data Governance groups within large enterprises, and developing a data model around stated use cases to capture the client’s KPIs and data transformations
· Advise on database performance, ETL process changes, SQL transformations, API integration, and rethinking the approach to certain KPIs
· Teach Big Data Engineering concepts to a variety of audiences, including data engineers, data architects, data scientists, business users, and IT professionals
· Document and communicate product feedback in order to improve user experience
· 15+ years of software implementation/services experience, preferably in the Data Analytics space
· Multiple AWS Certifications ideal
· Experience with AWS Cloud solutions deployed at Fortune 500 clients
· Strong understanding across Cloud and infrastructure components (server, storage, network, data, services, and applications) to deliver end-to-end Cloud Infrastructure architectures and designs
· Ability to simplify presentation of complex enterprise architecture during the sales cycle for large, multi-year cloud migration projects at Fortune 500 clients
· A passion for exploring cloud big data, extending this into Data Engineering, Machine Learning and AI
· Solid writing skills
· Significant experience with the System Development Life Cycle (SDLC)
· Proven analytical, problem-solving, and troubleshooting expertise
· Proficiency in SQL, preferably across a number of databases like MySQL, PostgreSQL, Redshift, SQL Server, and Oracle.
· Exposure to developer tools/workflows (e.g., Git/GitHub, Bitbucket, *nix, SSH, CodePipeline, Teams)
· Experience troubleshooting operations issues
· Optimizing database/query performance
· In-depth knowledge of AWS services like EC2, S3, Redshift, RDS, Auto Scaling, ELB, VPC, Systems Manager, IAM, Security & Compliance, Elastic Beanstalk, CloudFront, CloudTrail, Trusted Advisor, Storage Gateway, Glacier, Route 53, DMS, SMS, CloudWatch, SNS, Direct Connect, Lambda, VPN
· In-depth knowledge of AWS Big Data services like EMR (Hadoop), Glue, SageMaker, Redshift, IoT Analytics, and Elasticsearch
· Streaming data tools and techniques such as Kafka, MSK, AWS Kinesis, NiFi, and Microsoft Azure Stream Analytics
· Metadata management, data lineage, data governance, especially as related to Big Data
· ETL (Extract-Transform-Load) tools such as DataStage, Podium, or Talend
· Structured, Unstructured, Semi-Structured Data techniques and processes
· Various Big Data formats, both row-based and columnar
· Exposure to third-party assessment and migration tools
· Experience with business intelligence tools with a physical model (e.g., MicroStrategy, Business Objects, Cognos).
· Exposure to NoSQL databases like DynamoDB, HBase and Cassandra
· Exposure to SQL technologies (e.g., Athena, Presto, Hive, Spark SQL/Shark, Impala, BigQuery).
· MPP-style Data Warehouse and Data Mart design and implementation
· Demonstrated experience of design and implementation of microservices architecture and RESTful web services preferably using API Gateway
· Demonstrated experience scaling microservices using Docker, Kubernetes, EKS, ECS, and Fargate
· Protecting PII using file- or column-level encryption for data at rest or data in motion
· Experience with CI/CD pipelines using tools like Jenkins, CircleCI, and CodePipeline
· Experience with DevOps tools for cloud orchestration like Terraform, Chef, Ansible, and CloudFormation
· Experience utilizing progressive design patterns that make use of serverless technologies, dynamic scaling, event models, cost optimizations, and platform services
· Experience with network integration and Cloud connectivity of public and private environments, including ingress and egress of data to and from AWS
· Knowledge of cloud reference architectures and AWS best practices
· Experience with core AWS platform and security architecture, including: service account design, virtual private cloud, network design, subnetting, segmentation strategies, Identity and Access Management design, policies and federation strategies, including private key management
· Ability to learn and assimilate technical information quickly
· AWS Solutions Architect certification; AWS Big Data Specialty certification is a plus
· Bachelor's degree or higher
· Ability to travel up to 35%.