CURRENT OPENINGS

Software Developer

Software Developer with a Bachelor's degree in Computer Science, Computer Information Systems, or Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor's degree in one of these fields.

Job Duties and Responsibilities:

  • Design and develop warehouse models in cloud data warehouses such as Snowflake or AWS Redshift.
  • Design and develop Apache Spark transformation applications in Scala, Java, or Python to extract, process, and standardize raw data (a minimal Scala sketch follows this list).
  • Design and develop a highly available Apache Spark data hub platform cluster.
  • Realize architectures that satisfy non-functional requirements such as performance, security, scalability, reliability, and maintainability in big data cluster environments.
  • Develop and deploy Chef scripts on centralized DEV, QA, and PROD servers to install Java 8, Apache Spark, Apache Hive, Apache ZooKeeper, and the Telegraf agent.
  • Guide team members on configuring and using monitoring tools such as Ganglia and Grafana.
  • Configure the Apache Spark cluster for multi-tenancy by running the external shuffle service, which allows the cluster to be shared by multiple applications (see the configuration sketch after this list).
  • Develop and deploy APIs to store and retrieve data from Amazon S3 and Google Cloud Storage (see the storage sketch after this list).
  • Architect, design, and develop AWS cross-region disaster recovery to minimize the Recovery Point Objective (RPO) and Recovery Time Objective (RTO), keeping the application and infrastructure highly available with minimal data loss.
  • Architect and design big data streaming applications that use Redis, a high-performance, highly available NoSQL key-value store, for checkpointing.
  • Design and develop AWS cloud deployment scripts using AWS CloudFormation templates, Terraform, and Ansible.
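
For illustration only, here is a minimal sketch of the kind of Spark transformation application described above, written in Scala against the Spark SQL API. The bucket paths, column names, and object name are hypothetical placeholders, not part of this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object RawEventStandardizer {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("raw-event-standardizer")
          .getOrCreate()

        // Read raw JSON events from an illustrative S3 landing zone
        val raw = spark.read.json("s3a://example-raw-bucket/events/")

        // Standardize: normalize timestamps, trim identifiers, drop duplicates
        val standardized = raw
          .withColumn("event_ts", to_timestamp(col("event_ts")))
          .withColumn("user_id", trim(col("user_id")))
          .dropDuplicates("event_id")

        // Write curated Parquet, partitioned by event date, for downstream warehouse loads
        standardized
          .withColumn("event_date", to_date(col("event_ts")))
          .write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3a://example-curated-bucket/events/")

        spark.stop()
      }
    }

A job like this would typically be packaged with sbt and submitted to the cluster with spark-submit.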
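
Also illustrative: a sketch of the multi-tenancy configuration mentioned above, enabling Spark's external shuffle service together with dynamic allocation so that executors can be released and reclaimed without losing shuffle data. The executor bounds and application name are assumptions; the shuffle service itself must also be running on the cluster nodes (for example, as the YARN auxiliary service or on standalone workers).

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    object MultiTenantJobConfig {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .set("spark.shuffle.service.enabled", "true")    // use the external shuffle service
          .set("spark.dynamicAllocation.enabled", "true")  // scale executors up and down per workload
          .set("spark.dynamicAllocation.minExecutors", "1")
          .set("spark.dynamicAllocation.maxExecutors", "20")

        val spark = SparkSession.builder()
          .appName("multi-tenant-example")
          .config(conf)
          .getOrCreate()

        // ... application logic ...

        spark.stop()
      }
    }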
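
Finally, a sketch of a simple store-and-retrieve call against Amazon S3 using the AWS SDK for Java (v1) from Scala; the bucket name and key are placeholders, and an equivalent client library exists for Google Cloud Storage.

    import com.amazonaws.services.s3.AmazonS3ClientBuilder

    object S3StoreRetrieve {
      def main(args: Array[String]): Unit = {
        // The client picks up credentials and region from the standard AWS provider chain
        val s3 = AmazonS3ClientBuilder.defaultClient()

        // Store a small JSON document (bucket and key are placeholders)
        s3.putObject("example-bucket", "events/sample.json", """{"status":"ok"}""")

        // Retrieve it back as a string
        val body = s3.getObjectAsString("example-bucket", "events/sample.json")
        println(body)
      }
    }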

Skills / Knowledge required:

    • Expert knowledge of and work experience with big data technologies.
    • Work experience with Java and Scala.
    • Expert in big data architecture.
    • Expert knowledge of Spark, Kafka, Snowflake, JMS, Jenkins pipelines, transformations, and aggregations.
    • Working knowledge of Python and Hive.
    • Expert in JetBrains tools.
    • Expert in creating design documents and developing code according to those documents.
    • Expert in creating clusters in at least one of AWS, Azure, or Google Cloud.
    • Good verbal, written, and interpersonal communication skills.

Environment / Technologies involved:

AWS Cloud, Apache Spark, Apache Kafka, Scala, Java Message Service (JMS), JetBrains IDEs, Snowflake data warehouse, Apache Hive.

Work location is Portland, ME, with required travel to client locations throughout the USA.

Rite Pros is an equal opportunity employer (EOE).

Please Mail Resumes to:
Rite Pros, Inc.
565 Congress St, Suite # 305
Portland, ME 04101.

Email: resumes@ritepros.com