Job Duties:
1. Create and update risk models to accommodate new and changing inputs, new business logic, and a changing environment (threats, regulations, etc.), ensuring that models continue to produce meaningful, actionable outputs for clients.
2. Extract and ingest data from APIs and websites using web crawling tools.
3. Own the creation of these tools, services, and workflows to improve crawl/scrape analysis reports and data management.
4. Test the data and the scrape to ensure accuracy and quality.
5. Identify and rectify scraper breakages, and scale scrapes as needed.
6. Set up processes to run daily and identify quality issues in our database.
7. Manage and update database tables and back-end code when quality issues are identified.
8. Lead data integrity and integration, including improving the efficiency of current processes to source, clean, quality-check, and integrate several dozen sources of cybersecurity- and ESG-related data.
9. Guide the work of data analysts involved in extraction and cleansing.
10. Suggest improvements to database architecture, where possible.
11. Guide IT to build the data science capabilities described above into enterprise systems.
12. Provide input on refining our ratings methodologies based on feedback from customers, partners, and available market research.
13. Support Sales and Advisors in client conversations as a cyber and ESG data expert, where required.
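The daily scrape quality checks described above can be sketched roughly as follows; the field names ("company", "domain") and the specific checks are illustrative assumptions, not part of the role description:

```python
import pandas as pd

def check_scrape_quality(df: pd.DataFrame, min_rows: int = 1) -> list[str]:
    """Return a list of quality issues found in a scraped batch.

    The checks (row count, required columns, nulls, duplicates) are
    hypothetical examples of what a daily quality process might cover.
    """
    issues = []
    if len(df) < min_rows:
        issues.append(f"too few rows: {len(df)} < {min_rows}")
    for col in ("company", "domain"):  # hypothetical required fields
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif df[col].isna().any():
            issues.append(f"nulls in column: {col}")
    if df.duplicated().any():
        issues.append("duplicate rows present")
    return issues

# Example: one clean batch and one with a missing domain.
clean = pd.DataFrame({"company": ["Acme", "Globex"],
                      "domain": ["acme.com", "globex.com"]})
dirty = pd.DataFrame({"company": ["Acme"], "domain": [None]})
print(check_scrape_quality(clean))   # []
print(check_scrape_quality(dirty))   # ['nulls in column: domain']
```

A check like this would typically run after each scheduled scrape, with any returned issues routed to whoever maintains the scraper.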
Skills and attributes for success:
1. Solid Python skills (Python 3, API integrations, Pandas, etc.).
2. Strong relational database and data modeling skills with SQL.
3. Experience developing automated/scripted approaches to data cleansing, transformation, and analysis.
4. Experience running large-scale web scrapes; experience with simple natural language processing.
5. Familiarity with techniques and tools for crawling, extracting, and processing data (e.g. Scrapy, Pandas, MapReduce, SQL, dbt, BeautifulSoup, HTML/JavaScript).
6. Familiarity with Linux, AWS, Docker, and container orchestration technologies (ECS, Kubernetes, etc.).
7. Multivariate data modeling; skilled at pulling insights from data.
8. Experience with system monitoring/administration tools, including handling script failures.
9. Experience with version control, open-source practices, and code review.
10. Familiarity with one or more common data modeling and statistical analysis software packages (e.g. SPSS, Stata); expert in MS Excel.
11. Attention to detail, a keen eye for quality, and self-motivation to continuously pressure-test models and systems.
12. Ability to learn and apply the high-level basics of ESG and cybersecurity.
13. Ability to work effectively with IT engineers to build data science capabilities into enterprise systems.
14. Ability to relate to non-technical clients as a subject matter expert.
15. Ability to work independently; strong project management skills.
16. Resourcefulness; able to articulate creative ways to solve problems and break logjams.
17. High tolerance for ambiguity; willing to roll up their sleeves to get stuff done.
18. Willingness to learn and interest in opportunities outside your comfort zone.
19. Ability to build strong relationships and engage with multiple stakeholders across the organization.
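As an illustration of the SQL and relational data modeling skills listed above, here is a minimal sketch using Python's built-in SQLite; the vendors/findings schema is a hypothetical example, not the company's actual data model:

```python
import sqlite3

# In-memory database; the schema below is a made-up example of relating
# cybersecurity findings to the companies they concern.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vendors (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    domain TEXT UNIQUE NOT NULL
);
CREATE TABLE findings (
    id INTEGER PRIMARY KEY,
    vendor_id INTEGER NOT NULL REFERENCES vendors(id),
    severity TEXT CHECK (severity IN ('low', 'medium', 'high')),
    found_on TEXT NOT NULL
);
""")
conn.executemany("INSERT INTO vendors (name, domain) VALUES (?, ?)",
                 [("Acme", "acme.com"), ("Globex", "globex.com")])
conn.executemany(
    "INSERT INTO findings (vendor_id, severity, found_on) VALUES (?, ?, ?)",
    [(1, "high", "2024-09-01"), (1, "low", "2024-09-02"),
     (2, "medium", "2024-09-03")])

# Aggregate findings per vendor -- the kind of summary query a data
# quality or ratings process might run.
rows = conn.execute("""
    SELECT v.name, COUNT(f.id) AS n_findings
    FROM vendors v LEFT JOIN findings f ON f.vendor_id = v.id
    GROUP BY v.id ORDER BY v.name
""").fetchall()
print(rows)   # [('Acme', 2), ('Globex', 1)]
```

The UNIQUE, NOT NULL, and CHECK constraints push basic data quality enforcement into the database itself, complementing the scripted checks the duties describe.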
Ideally, you'll also have:
1. 3-5 years' experience in a data and analytics-related role.
2. An MSc in Statistics, a BTech in Computer Science or Data Science, or an MBA with a Data and Analytics focus.
3. Highly fluent scripting experience (Python, R, or similar).
4. Experience using ETL tools and data warehouses (e.g. Airflow, SSIS, Snowflake).
5. Experience with Ruby and Ruby on Rails development.
Date Posted: 09/08/2024
Job ID: 88073575