Job Details

Research Engineer - Data Integration Infrastructure

Advertiser
TigerGraph
Location
San Diego, California, United States
TigerGraph is the world's fastest graph analytics platform, designed to unleash the power of interconnected data for deeper insights and better outcomes. We welcome people from all backgrounds who seek the opportunity to help build the next-generation graph computing and analytics platform.
TigerGraph is looking for a Research Engineer in the TigerGraph Innovation Lab to conduct advanced research and development in data integration between graph database systems and external data sources and targets, including data preparation, transformation and loading, streaming-based data ingestion and export, and data connectors to a wide range of analytics tools. The ideal candidate is an experienced data engineer versed in state-of-the-art research and tools for ETL, data loading, data connectors, and streaming-based data-in and data-out; a creative innovator with the curiosity and desire to go beyond what has been done before; and someone with a proven ability to turn innovative ideas into working prototypes and present them to different audiences.
Great insights into TigerGraph and the graph space:
  • Recent press release about TigerGraph Cloud
  • TigerGraph completes $105 million Series C funding

Responsibilities
  • Research, design, and prototype innovative approaches to accelerate large-scale data-in and data-out integration between graph databases and other data sources and targets
  • Compare and contrast different approaches to data integration, and derive common frameworks that simplify and speed up data integration
  • Work with product engineering to turn performance optimization algorithms into production implementations
  • Collaborate closely with external researchers and partners to jointly develop innovative approaches to data integration
  • Author, publish, and present innovative ideas at internal and external conferences and in online media
Requirements
  • Bachelor's, Master's, or PhD in Computer Science, Mathematics, Engineering, or a related field
  • In-depth knowledge of state-of-the-art research and tools for data integration at scale, including ETL, Spark/Databricks, and BI tool integration
  • 5+ years of hands-on experience implementing large-scale data integration capabilities via ETL/Spark, BI tool integration, and data connectors
  • Passion for taking on hard, complex problems that have an impact on practical enterprise applications
  • A 'can do' attitude with a strategic, rigorous mindset, and comfort working in a fast-paced, multi-faceted environment
  • Demonstrated ability to collaborate with peers inside and outside the company
  • Strong communication skills with both technical and non-technical audiences
