Software Developer with a Bachelor’s degree in Computer Science, Computer Information Systems, or Information Technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.
Job Duties and Responsibilities:
- Define the end-to-end solution architecture for large-scale technology projects, applying deep technical expertise in distributed processing and real-time, scalable systems.
- Design and develop Big Data streaming applications that use Redis, a high-performance, highly available NoSQL key-value store, for checkpointing.
- Design and develop Spark applications in Scala that use DOM/SAX parsers to parse incoming raw string/XML data.
- Design and develop AWS cloud deployment scripts using AWS CloudFormation templates, Terraform, and Ansible.
- Design, develop, and troubleshoot Hive, Pig, Flume, MongoDB, Sqoop, ZooKeeper, Spark, MapReduce2, YARN, HBase, Kafka, and Storm.
- Fine-tune applications and systems for high performance and higher volume throughput, and pre-process data using Hive and Pig.
- Translate, load, and present disparate data sets in various formats and from sources such as JSON, text files, Kafka queues, and log data.
- Install and configure Docker images for Telegraf, InfluxDB, Grafana, and Kapacitor on an AWS EC2 monitoring instance.
- Design and develop Kapacitor scripts for alerting via push notifications, SMS, email, and Slack.
- Define the technology/Big Data strategy and roadmap for client accounts, and guide implementation of that strategy within projects.
- Apply strong project management skills to deliver complex projects, including effort/time estimation, building a detailed work breakdown structure (WBS), managing the critical path, and using PM tools and platforms.
- Build scalable client-engagement-level processes for faster turnaround and higher accuracy.
- Run regular project reviews and audits to ensure that projects are executed within the guardrails agreed upon by all stakeholders.
- Manage client stakeholders, and their expectations, with a regular cadence of weekly meetings and status updates.
Technologies/Environment involved:
- Big Data / Hadoop: Cloudera Manager, Cloudera Distribution Hadoop (CDH), HDFS, MapReduce, HBase, Apache Pig, Hive, Sqoop, Flume, YARN (MR2), Apache Solr, Impala, ZooKeeper, HUE (Hadoop User Experience), Sentry, Oozie, Spark, Key Trustee Server, Key Management Server, Kerberos, shell scripting, cloud computing architecture.
- Data Visualization and BI Tools: ETL process tools, dashboards, data analytics, Tableau, Alteryx.
- Operating Systems: Red Hat Linux, Unix, Windows, Ubuntu, Solaris.
- Databases: Hive, MongoDB, Cassandra, PostgreSQL, MySQL, Oracle, Redshift.
- Cloud Technologies: AWS, Dremio, Azure.
- DevOps Tools: Bitbucket, Git, Apache Maven, Selenium, Jenkins, Docker.
- Graph Processing: Distributed Graph DB.
- Programming Languages: Java, Scala, Python [REST Framework]
- Log monitoring tools: Splunk
Work location is Portland, ME, with required travel to client locations throughout the USA.
Rite Pros is an equal opportunity employer (EOE).
Please Mail Resumes to:
Rite Pros, Inc.
565 Congress St, Suite # 305
Portland, ME 04101.