We're hiring

We at RIK Data Solutions Inc. are a dynamic team, always on the lookout for exceptional talent. Go ahead and send your resume to:

Email : hr@rds-us.com

Phone : 941-527-1464

Fax : 732-358-0208

We are currently hiring for the following positions:

Application and DBA Support (2 positions)

We are looking for highly experienced candidates to support a Java application residing on a Hadoop platform. This position will be viewed as a subject matter expert on Hadoop and will help the application teams manage and tune the application. A former engineering resource is one example of a candidate with the necessary skill set.

  • In-depth understanding of how the Java application interacts with the various Hadoop layers, including Hive, HBase, Impala, and Spark
  • Well versed in the tool set used to interface with these services
  • Excellent understanding of the complete Hadoop ecosystem from the administration side
  • Solid understanding of the YARN resource manager
  • Sqoop workflow management
  • Expertise in the Hadoop layers and how they interact with each other and with the application
  • Will have developed significant amounts of code in Pig or Sqoop
  • Will have ported such code to Python or Java
  • Will have managed a service such as Hive or HBase
  • Working knowledge of the Hadoop security layer
  • Previous banking experience is a plus
  • Knowledge of Java is a plus
  • Strong Cloudera 5.5.1 administration experience
  • Strong Spark experience

Data Governance

The manager for enterprise data management in a Big Data environment is responsible for the definition, stewardship, quality, control, lineage, and access of business data stored in Big Data platforms. The right candidate must have a good understanding of the data management challenges associated with the Big Data space and broad experience in defining process and policy, implementing procedures and supporting technology, and overseeing ongoing data management.
The individual is expected to possess the technical and business skills to help develop and manage our enterprise data management, data classification, data quality, and control practices. Candidates should possess strong communication skills and the ability to translate complex information for consumption by a variety of stakeholders.

  • Responsible for all aspects of data governance and stewardship globally in multi-tenant Big Data environment.
  • Build processes to improve data definition, security, quality and use of data in Big Data environment leveraging Hadoop and Greenplum.
  • Work with business, technology, and controls leaders to define data elements for integration and reporting and to establish effective process and system controls for data creation, maintenance, and deletion.
  • Document and communicate standardized global data definitions for the organization.
  • Provide thought leadership to the business and technical teams on best practices for data governance.
  • Manage communication and education of the business users and IS support resources to ensure compliance with governance policies.
  • Perform self-assessment of the effectiveness of the process by routine monitoring across data sources and subscribers.
  • Establish policies and procedures, initiating appropriate strategies to improve the efficiency, effectiveness, control, and overall data quality for subscribing systems.
  • Develop, implement, and sustain master data processes

Skills Required

  • Hands-on experience of working on large Big Data projects
  • Good understanding of the data management challenges associated with fast changing multi-tenant Big Data environments
  • Excellent understanding of Hadoop ecosystem and associated data processing tools
  • Very good understanding of the commercial and open source tools for end-to-end enterprise data management with focus on tools being used in Big Data space
  • A self-motivated, seasoned professional who is able to work in a fast-paced and constantly changing environment.
  • Significant technical experience and high level problem solving aptitude
  • Solid leadership and people management skills, including demonstrated ability to coach, mentor and develop employees
  • Ability to work in a highly matrixed and geographically diverse business environment
  • Strong interpersonal skills, a sense of urgency and focus on deadlines, and highly developed time management skills are a must.
  • Ability to perform quality review of reconciliation procedures and provide timely feedback.
  • Ability to build consensus and to make decisions based on the information at hand.
  • A strong understanding of Extract, Transform, and Load (ETL) procedures and solutions.
  • Adherence to information security policies and processes to ensure consistent quality of service and compliance with data privacy and data access regulations and best practices.
  • Extensive experience working in a team-oriented, collaborative environment across multiple countries around the world.
  • Prior experience developing and managing Data Architecture is preferred.

Education

MS/BS degree in Computer Science, Mathematics, Engineering, or a related field, with 15 years of experience.


Experience

  • 15 years’ experience working in an information systems environment in a variety of roles.
  • 5+ years’ experience leading Data Governance or Data Architecture programs in a complex organizational environment.
  • 3+ years working in a Big Data environment
  • 2+ years of work experience in data management in a Big Data environment
  • 3+ years supervisory experience.

Data Operations Wholesale ETL Developer (V2)

This position is for an ETL Hadoop associate developer supporting JPMIS Data Operations. The ideal candidate will be responsible for designing, developing, and implementing the client’s ‘Wholesale’ data assets by leveraging both internal firm resources and external data, and will be results-driven with a passion for transforming large, disparate data sets into actionable information. The candidate will be an integral part of building a best-in-class data management and governance framework leveraging Big Data technologies, and should possess strong data mining, ETL development, and analytical skills, along with exceptional relationship management skills and a commitment to delivering high-quality solutions.

  • Add new data streams for wholesale bank (includes Investment Bank, Asset Management, Corporate Bank, Wealth Management etc.) to the current IntelliStor data asset.
  • Innovate and develop new ways of managing, transforming, and validating data
  • Effectively partner with the Data Insights team to drive usage and improvement of the IntelliStor data asset that drives analytics within JPMIS and for our strategic partners
  • Adhere to guidelines to ensure consistency, quality and completeness of all JPMIS data assets
  • Apply quality assurance best practices to all work products

Building Relationships and using Influence

  • Excellent interpersonal skills
  • Recognizes nuances in reading others and leverages these to influence key stakeholders
  • Excellent communicator and listener who is able to present across all levels in the organization

Executing for Results

  • Possesses high levels of energy and endurance
  • Rigorously holds oneself and others accountable for achieving high levels of individual and organizational performance

Business Acumen

  • Consistently evaluates decisions in terms of impact to the business
  • Uncovers hidden opportunities in disparate data and translates them into actionable information
  • Technical expertise regarding data management frameworks and ETL solutions
  • Technical expertise regarding data models and database design development
  • Experience with SQL, ETL frameworks and data visualization tools

Client Manager Interview Notes

  • Investment Banking experience is very important; experience in wholesale banking, wealth management, commercial banking, asset management, risk management, or treasury is also valuable
  • Hadoop and related components such as Hive, Impala, HBase, and Oozie are key
  • UNIX/Linux shell scripting is key
  • Extensive Java knowledge is key
  • Must be hands-on coder/programmer ready to hit the ground running

Qualifications

  • Financial Services background or experience preferred, specifically as it relates to products within the Wholesale banking space (e.g., investment, wealth management, asset management, or corporate banking, in areas like derivatives and equities)
  • Proven proficiency in developing high-performance ETL to transform and load high volumes of data
  • Proficiency across the full range of database and business intelligence tools: extracting, transforming, loading, publishing, and presenting information in an engaging way
  • Intensive, recent development experience in assessing, sourcing, and loading high volumes of data
  • Detail oriented with a commitment to innovation

Knowledge/Technical Skills

  • 5-10 years of total experience in IT
  • Experience in industry leading Business Intelligence tools
  • Knowledge of Java (Core, Servlets, JDBC) or Python development is required
  • Experience in data mining techniques and procedures and knowing when their use is appropriate
  • Experience utilizing and extending ETL solutions (e.g., Informatica, Talend, Pentaho, Ab Initio) in a complex, high-volume data environment
  • Ability to present complex information in an understandable and compelling manner
  • Knowledge of Big Data technologies (e.g., Hadoop, GreenPlum) is preferred
  • Knowledge of statistics, at least to the degree necessary to communicate easily with statisticians, is a plus
  • Experience with machine learning is preferred
  • Bachelor's or Advanced Degree in Information Management, Computer Science, Mathematics, Statistics, or related fields desired

Java / Python / Scala / Hadoop

  • Familiarity with Hadoop
  • Strong knowledge of Java, Python or Scala (preferably in the context of Hadoop development)
  • Familiarity with ETL concepts and data modeling
  • Proficient in SQL
  • Knowledge of shell scripting

Apply

Please fill out the form below to apply.