Big Data Engineer
Big Data Engineer with a Bachelor's Degree in Computer Science, Computer Information Systems, or Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of the aforementioned subjects.
Job Duties and Responsibilities:
- Collaborate with System Architects to design, develop, test, and implement efficient Snowflake data models aligned with enterprise data warehousing standards.
- Build and manage cloud and hybrid integrations using third-party ETL tools like Azure Data Factory for seamless Snowflake data pipelines.
- Maintain CI/CD pipelines in GitHub and Azure DevOps to ensure code quality, version control, and automated deployment for Snowflake solutions.
- Conduct SQL performance analysis, query optimization, and database tuning to improve efficiency and scalability in Snowflake models.
- Develop robust, scalable Snowflake integrations and data models following industry best practices.
- Perform root cause analysis on data model issues and resolve performance bottlenecks to maintain reliable Snowflake systems.
- Engage with customers and architects to refine requirements and communicate effectively for business alignment.
- Apply strong time management to prioritize concurrent tasks and deliver Snowflake projects promptly.
- Use analytical and problem-solving skills to address complex warehouse challenges within the Snowflake ecosystem.
Work experience / Technologies required for the position:
- At least 5 years of experience building and optimizing Big Data pipelines, architectures, and data sets.
- Experience in performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytical skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large, disconnected data sets.
- Working knowledge of message queuing, stream processing, and highly scalable Big Data stores.
- Experience in supporting and working with cross-functional teams in a dynamic environment.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Azure SQL and Snowflake.
- Experience with Azure cloud services: ADLS, Azure Synapse, Azure Databricks, and Delta Lake.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
Work location is Portland, ME, with required travel to client locations throughout the USA.
Rite Pros is an equal opportunity employer (EOE).
Please Mail Resumes to:
Rite Pros, Inc.
565 Congress St, Suite # 305
Portland, ME 04101.
Email: resumes@ritepros.com