About Clearwater Analytics:
Clearwater Analytics (NYSE: CWAN), a global, industry-leading SaaS solution, automates the entire investment lifecycle. With a single instance, multi-tenant architecture, Clearwater offers award-winning investment portfolio planning, performance reporting, data aggregation, reconciliation, accounting, compliance, risk, and order management. Each day, leading insurers, asset managers, corporations, and governments use Clearwater's trusted data to drive efficient, scalable investing on more than $6.4 trillion in assets spanning traditional and alternative asset types.
Our mission: To be the world's most trusted and comprehensive technology platform that simplifies the entire investment lifecycle and eventually revolutionizes the world of investing.
Job Summary:
A Data Engineer designs, develops, and maintains data systems and architectures to collect, store, process, and analyze large volumes of data. They build data pipelines, optimize data models, and ensure data quality and security. They collaborate with cross-functional teams to meet business objectives and stay current with emerging technologies and industry best practices.
Responsibilities:
- Designs, builds, and oversees the deployment and operation of technology architecture, solutions, and software to capture, manage, store, and utilize structured and unstructured data from internal and external sources.
- Establishes and builds processes and structures, based on business and technical requirements, to channel data from multiple inputs, route it appropriately, and store it using any combination of distributed (cloud) structures, local databases, and other applicable storage forms as required.
- Develops technical tools and programming that leverage artificial intelligence, machine learning, and big data techniques to cleanse, organize, and transform data, and to maintain, defend, and update data structures and integrity on an automated basis.
- Creates and establishes design standards and assurance processes for software, systems, and applications development to ensure compatibility and operability of data connections, flows and storage requirements.
- Reviews internal and external business and product requirements for data operations and activity and suggests changes and upgrades to systems and storage to accommodate ongoing needs.
- Coordinates with data providers around planned changes to raw data feeds.
- Profiles and optimizes data pipelines.
- Creates well-documented and tested data flows using established processes when given loosely defined requirements.
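To illustrate the kind of work the responsibilities above describe, here is a minimal, hypothetical sketch of one pipeline step: cleansing a raw data feed with a basic quality gate. All names (Holding, cleanse, the sample feed) are invented for illustration and are not Clearwater APIs.

```python
from dataclasses import dataclass

@dataclass
class Holding:
    account: str
    asset_type: str
    market_value: float

def cleanse(raw: list[dict]) -> list[Holding]:
    """Drop malformed rows and normalize fields (a basic data-quality gate)."""
    out = []
    for row in raw:
        try:
            out.append(Holding(
                account=row["account"].strip(),
                asset_type=row.get("asset_type", "unknown").lower(),
                market_value=float(row["market_value"]),
            ))
        except (KeyError, ValueError, AttributeError):
            # In a production pipeline, bad rows would be routed to a
            # dead-letter store for review rather than silently dropped.
            continue
    return out

raw_feed = [
    {"account": " A-1 ", "asset_type": "Bond", "market_value": "100.5"},
    {"account": "A-2", "market_value": "oops"},  # malformed: dropped
]
clean = cleanse(raw_feed)
```

In practice a step like this would run inside an orchestrator such as Airflow, with the cleansed output written to a database or cloud store rather than held in memory.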
Required Skills:
- Strong programming skills in Python, Java, or Scala.
- Experience with big data technologies like Hadoop, Spark, or Kafka.
- Proficiency in SQL and database technologies.
- Familiarity with ETL processes and tools (e.g., Airflow).
- Experience with cloud platforms (AWS, Azure, GCP).
- Familiarity with data visualization tools (Tableau, Power BI).
- Strong computer skills, including proficiency in Microsoft Office.
- Excellent attention to detail and strong documentation skills.
- Outstanding verbal and written communication skills.
- Strong organizational and interpersonal skills.
- Exceptional problem-solving abilities.
Education & Experience:
- Bachelor's degree in a technical discipline or a related field is required.
- 4-10 years of experience.