Let's build SOMETHING GREAT TOGETHER!

Sr. Software Developers (JOBNO3338)


Job Description

REQUIRED EXPERIENCE:

Master's degree or foreign equivalent degree in Computer Science, Engineering, or a related field plus 2 years of experience in the job offered or as a Computer Software Professional; OR a Bachelor's degree or foreign equivalent degree in Computer Science, Engineering, or a related field plus 5 years of experience in the job offered or as a Computer Software Professional.


Experience must include 2 years with Spark, Scala IDE, and PuTTY tools. Background check and drug test required.


JOB DUTIES:

- Coordinate with the Claim Finance, Member, and Provider teams and multiple third-party data services to deliver requirements compatible with all analytical systems.
- Analyze business and functional requirements and provide input on application enhancements.
- Collaborate with SMEs to establish the technical vision and weigh trade-offs between usability and performance needs.
- Take responsibility for the volume, quality, timeliness, and delivery of data science projects, along with short-term resource planning.
- Translate complex functional and technical requirements into detailed designs and implement them using Big Data technologies such as Spark, Kafka, HBase, Cassandra, NiFi, and Sqoop.
- Build highly distributed, scalable, multi-tenant, enterprise-wide data pipelines on real-time streaming and batch analytics platforms using Hadoop ecosystem components.
- Set up development environment workspaces based on the Hadoop application architecture and other database technologies.
- Make necessary code and configuration changes to various modules to ensure new code changes are integrated properly.
- Coordinate software installation and monitor requirements functionality to ensure specifications are met.
- Design and implement distributed, scalable data marts, using Spark Streaming, Apache Spark, Apache NiFi, Kafka, Scala IDE, PuTTY, and Flume to create data streaming solutions.
- Develop skills in business requirement capture and translation, hypothesis-driven consulting, work stream and project management, and client relationship development.
- Review input and output specifications of other systems in order to integrate various modules with the application server specific to data pipeline analytics.
- Guide the team in adopting cloud and serverless technologies and good design practices, and find opportunities to simplify and scale.
- Write complex test cases to cover interactions with other systems and conduct sessions to debug system issues.
- Address production issues reported by the Production Monitoring Team in a timely manner.
- Provide deployment and post-deployment support of applications across various business areas and application layers.
- Update best-practice documents based on issues found in code reviews, testing, and production when they are related to poor coding practices.
- Travel/relocate to various unanticipated locations to interact with clients and train users for short- and long-term assignments.

LOCATION:

Must be willing to travel/relocate to client sites anywhere in the U.S.







