The Role: Sr. Data Researcher II
The Team: The Data Sourcing and Normalization group collects and manages market fundamental datasets for the energy industry through processes that combine analytical and technical skills. This role is responsible for data collection, content verification, quality control, and relevant analysis to support Platts content and products. Industry knowledge, combined with analytical acumen, is applied to independently identify and resolve data integrity issues daily.
The Impact: This role is the foundation of the products delivered. The data collected is the base for the company: it feeds into the products and platforms and is used for analysis by other teams within the company.
What’s in it for you:
- Perl, IronPython and SQL development experience and growth
- Opportunity to work with a globally operating team
- Being part of Platts/Data Sourcing, a nationally recognized business intelligence provider. Platts provides information to a broad range of customers, from large multinational corporations to small companies.
The Responsibilities:
- Accountable for ensuring timely and accurate collection and database storage. This includes hands-on coding to ensure collected data is proofed to source, handling daily exceptions (changes to source data, process errors, verifying suspect data), applying industry best practices, and tracking to operational metrics.
- Accountable for analyzing new assignments, developing scripts and tests to validate data, handling change requests and providing quick and efficient solutions to data sourcing issues.
- Responsible for migrating existing automation scripts from Perl to an IronPython framework.
- Responsible for creating requirements, metadata, backup procedures, and process documentation, and for overall management of the dataset-addition process.
- Works with Technology, Quantitative, and Editorial teams to ensure that content is delivered on time, regardless of system issues, and that the format and structure support the business need.
- Responsible for following up on exceptions and outliers associated with specific datasets. This may include follow-up via industry contacts, third-party verification, additional research, and modeling or related analyses.
- Maintains ongoing metrics on data quality, time management, and cost, to be used across the Information Intelligence Services group.
- Writes data queries for internal use as required.
- Completes one-time data pulls from database as requested.
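To make the validation and exception-handling duties above concrete, here is a minimal Python sketch of the kind of script the role involves: flagging suspect values in a collected series before database storage. The function name and the 50% jump threshold are illustrative assumptions, not Platts tooling.

```python
def flag_suspect(values, max_jump=0.5):
    """Return indices whose value jumps more than max_jump (default 50%)
    relative to the previous value; these would be routed for review
    against the source rather than loaded as-is."""
    flagged = []
    for i in range(1, len(values)):
        prev, curr = values[i - 1], values[i]
        # Skip a zero baseline to avoid division by zero.
        if prev and abs(curr - prev) / abs(prev) > max_jump:
            flagged.append(i)
    return flagged

# Example: index 3 jumps ~75% from the prior value and is flagged.
prices = [100.0, 101.5, 99.8, 175.0, 102.1]
print(flag_suspect(prices))  # → [3]
```

In practice such checks would run as part of the daily exception workflow, with flagged rows proofed to source before storage.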
What We’re Looking For:
- 3-5 years of total experience.
- Bachelor’s degree in Information Technology, Computer Information Systems, Computer Engineering, Computer Science or other technical discipline
- Perl and Python or IronPython development experience, or experience creating scripts for automated data collection
- Solid hands-on testing background with deep knowledge of web automation and debugging
- Demonstrated experience in working with users to test, debug and document systems
- Experience with relational databases, preferably SQL Server 2008 R2 or 2012 and/or Oracle
- Experience in working with large datasets
- Experience working in an Agile environment
- Complex problem-solving aptitude
- Enthusiastic and self-motivated, with the ability to work under pressure either independently or in a team environment
- Ability to use organizational/time management skills to prioritize workload
- Expertise in the energy industry