CURRENT OPENINGS

Big Data Engineer

We are seeking a Big Data Engineer with a Bachelor's degree in Computer Science, Computer Information Systems, or Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of the aforementioned subjects.

Job Duties and Responsibilities:

  • Define the end-to-end solution architecture for large-scale technology projects, applying deep technical expertise in distributed processing and real-time, scalable systems.
  • Architect, design, and develop Big Data streaming applications that use Redis, a high-performance and highly available NoSQL key-value store, for checkpointing (see the first sketch after this list).
  • Design and develop Spark applications in Scala that use DOM/SAX parsers to parse incoming raw string/XML data (see the second sketch after this list).
  • Design and develop Google Cloud deployment scripts using Compute Engine and Cloud Functions.
  • Fine-tune applications and systems for high performance and higher-volume throughput, and pre-process data using Hive and Pig.
  • Transform, load, and present disparate data sets in various formats and from various sources, such as JSON, text files, Kafka queues, and log data.
  • Design and develop Kapacitor scripts for alerting via push notifications, SMS, email, and Slack.
  • Define the technology/Big Data strategy and roadmap for client accounts, and guide implementation of that strategy within projects.
  • Apply strong project-management skills to deliver complex projects, including effort/time estimation, building a detailed work breakdown structure (WBS), managing the critical path, and using PM tools and platforms.
  • Build scalable client-engagement-level processes for faster turnaround and higher accuracy.
  • Run regular project reviews and audits to ensure that projects are being executed within the guardrails agreed by all stakeholders.
  • Manage client stakeholders and their expectations with a regular cadence of weekly meetings and status updates.
  • Participate in daily stand-ups, offshore/onsite connects, product roadmap reviews, backlog refinement and story grooming sessions, and sprint planning sessions.
  • Inherit existing product modules and features, and work on improving performance, additional features, and NFRs.
  • Define and deliver integrated solutions by applying proven delivery methodologies including Agile.
  • Support UI/SDK/backend systems with API portals, documentation, and Postman scripts.
  • Support the development team in defining standards and best practices.
  • Create AWS S3 buckets and perform folder management in each bucket.
  • Integrate Amazon CloudWatch with Amazon EC2 instances to monitor log files and track metrics.
  • Create monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Configure AWS Identity and Access Management (IAM) groups and users for improved login authentication.
  • Speed up service development by utilizing virtual backend services in contract-based development.
  • Work with the Mule Anypoint API Platform to design RAML specifications for implemented REST APIs.
  • Expose RESTful web services in Mule and invoke them using Postman.
  • Create Mule ESB flows using Anypoint Studio and perform payload transformations.
  • Design, develop, and test mobile application features at the middleware and microservices layers.
  • Handle RAML creation, asset creation, and implementation of services using best practices, reusable components, and performance expectations.
  • Work with AWS-hosted S3, DynamoDB, ElastiCache, database, and Kinesis services.
  • Implement analytics, pagination, filtering, configuration loading, and caching.
  • Apply advanced flow strategies, RabbitMQ messaging, and DataWeave transformations.
  • Work on Redis log streaming, Open Test automation scripts, and JMeter performance tests.
  • Work on Anypoint Platform VPNs and VPCs, including load balancers, by whitelisting IPs.
  • Test applications under different environments.
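
For illustration, here is a minimal sketch of the Redis checkpointing pattern named in the list above: a small helper that persists and recovers the last processed Kafka offset per topic/partition. The class name, key scheme, and demo values are assumptions for this sketch, not a prescribed implementation; it uses the Jedis client against a local Redis instance.

    import redis.clients.jedis.Jedis

    // Sketch only: stores streaming checkpoints (last processed Kafka offset
    // per topic/partition) in Redis so a restarted job can resume.
    class RedisCheckpointer(host: String, port: Int) {
      private val jedis = new Jedis(host, port)

      // Hypothetical key scheme, e.g. "checkpoint:events:0".
      private def key(topic: String, partition: Int): String =
        s"checkpoint:$topic:$partition"

      // Persist the offset after a micro-batch commits successfully.
      def save(topic: String, partition: Int, offset: Long): Unit =
        jedis.set(key(topic, partition), offset.toString)

      // Recover the offset on restart; start from 0 if no checkpoint exists.
      def restore(topic: String, partition: Int): Long =
        Option(jedis.get(key(topic, partition))).map(_.toLong).getOrElse(0L)
    }

    object RedisCheckpointerDemo {
      def main(args: Array[String]): Unit = {
        val cp = new RedisCheckpointer("localhost", 6379)
        cp.save("events", 0, 42L)
        println(cp.restore("events", 0)) // prints 42
      }
    }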
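
Similarly, a minimal sketch of the Scala/Spark XML-parsing duty flagged above: a Spark job that parses incoming raw XML strings with the JDK's built-in DOM parser. The input path and the "orderId" element name are hypothetical placeholders.

    import java.io.ByteArrayInputStream
    import javax.xml.parsers.DocumentBuilderFactory
    import org.apache.spark.sql.SparkSession

    object XmlParseJob {
      // Parse one raw XML record with a DOM parser; None on malformed input.
      // (A per-record builder keeps the sketch simple; production code would
      // reuse parsers per partition.)
      def parseRecord(raw: String): Option[String] = {
        try {
          val builder = DocumentBuilderFactory.newInstance().newDocumentBuilder()
          val doc = builder.parse(new ByteArrayInputStream(raw.getBytes("UTF-8")))
          val nodes = doc.getElementsByTagName("orderId") // hypothetical element
          if (nodes.getLength > 0) Some(nodes.item(0).getTextContent) else None
        } catch {
          case _: Exception => None
        }
      }

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("XmlParseJob").getOrCreate()
        import spark.implicits._
        // Placeholder input path for the incoming raw string/XML feed.
        val orderIds = spark.read.textFile("s3a://raw-events/")
          .flatMap(raw => parseRecord(raw).toSeq)
        orderIds.show()
        spark.stop()
      }
    }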

Skills / Knowledge required:

  • 2+ years of hands-on experience on Snowflake Data Warehouse Solutions.
  • Expert-level hands-on experience with Snowflake architecture design, role management, and data share designs.
  • Expertise in deploying applications by leveraging COPY, Snowpipe, Tasks, Streams, etc. (see the sketch after this list).
  • Expert-level knowledge and hands-on experience in containerization, image building, packaging, creating CI/CD pipelines, and managing infrastructure as code.
  • Expert-level knowledge and hands-on experience in Azure Storage, Data Lake, Delta Lake, and lakehouse architecture.
  • 3+ years of hands-on experience on Databricks, Spark Clusters, PySpark.
  • Good knowledge of network, infrastructure, and security aspects of Azure.
  • Good knowledge of embedding ML into applications, application integration, application identities and security, and developing mobile applications such as chatbots by leveraging native Azure services; knowledge of Power BI.
  • Agile ways of working.
  • Strong analytical and problem-solving skills.
  • Strong written and verbal communication skills.
  • Ability to work independently as an individual contributor and as a team member.
  • Experience in AWS EMR, AWS Glue, SQL, and ETL architecture.
  • Experience in Data Modeling, API design.
  • Experience with AWS NoSQL databases (DocumentDB, DynamoDB) and Amazon Aurora.
  • Implementation experience with AWS Elasticsearch and APIs.
  • Google Cloud SQL, Spanner, Datastore, and Bigtable.
  • Java, Spring Boot, JUnit.
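
As context for the Snowflake items above, the following is a minimal, illustrative Scala sketch of a bulk load run over Snowflake's JDBC driver with a COPY INTO statement (the same load that Snowpipe automates). The account locator, warehouse, database, stage, and table names are placeholders.

    import java.sql.DriverManager
    import java.util.Properties

    object SnowflakeCopyDemo {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        props.put("user", sys.env("SNOWFLAKE_USER"))          // from environment
        props.put("password", sys.env("SNOWFLAKE_PASSWORD"))
        props.put("db", "ANALYTICS")                          // placeholder database
        props.put("schema", "PUBLIC")
        props.put("warehouse", "LOAD_WH")                     // placeholder warehouse

        // Standard Snowflake JDBC URL; "myaccount" is a placeholder locator.
        val url = "jdbc:snowflake://myaccount.snowflakecomputing.com/"
        val conn = DriverManager.getConnection(url, props)
        try {
          val stmt = conn.createStatement()
          // Bulk-load staged JSON files into a table; Snowpipe runs the same
          // COPY automatically as new files land on the stage.
          stmt.execute(
            "COPY INTO raw_events FROM @events_stage FILE_FORMAT = (TYPE = 'JSON')")
        } finally {
          conn.close()
        }
      }
    }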

Technologies / Environment involved:

  • Distributed storage: AWS Cloud Storage (S3), Azure HDInsight, Google Cloud (GCP), Azure Storage
  • Database management: MongoDB, Cassandra, Postgres, Oracle, MS SQL Server, Redshift
  • Graph processing: Distributed Graph DB
  • Machine learning: Spark Machine Learning Library (MLlib), TensorFlow, Keras
  • Data processing: Databricks, Spark, Hadoop MapReduce, Pig, Flume, Sqoop, ZooKeeper, YARN, HBase, Kafka, Storm, Airflow, Spark Streaming
  • Programming Languages: Scala, Python [REST Framework], Shell Scripting
  • DevOps tools: Bitbucket, Git, Apache Maven, Selenium, Jenkins, Docker
  • Cloud Data Warehouse: Snowflake

Work location is Portland, ME, with required travel to client locations throughout the USA.

Rite Pros is an equal opportunity employer (EOE).

Please Mail Resumes to:
Rite Pros, Inc.
565 Congress St, Suite # 305
Portland, ME 04101.

Email: resumes@ritepros.com