At Elliptic we believe cryptocurrency will play a huge role in the future of value transfer, and we care deeply about helping to build this future. In order for cryptocurrency to flourish, it's important to prevent criminal abuse of the technology. Elliptic is the global leader in detecting, preventing, and pursuing criminal activity in cryptocurrencies. Our clients include the world's leading cryptocurrency exchanges, financial institutions and government agencies.
Our unique platform gives us an unparalleled understanding of cryptocurrency capital flows, using a combination of network science and machine learning to aggregate and interpret vast quantities of transaction data. We provide anti-money laundering (AML) compliance software and investigative services to the leading participants in the cryptocurrency ecosystem. Customers rely on us to analyse more than $150bn of their transactions every month, and include cryptocurrency businesses, major financial institutions, and federal government agencies.
The company has offices in London, UK, New York, NY, and Arlington, VA. We are backed by Octopus Ventures, SignalFire, Paladin Capital, Santander InnoVentures, and Digital Currency Group.

What's the role?
Elliptic is looking for an ambitious, passionate data professional to help expand our cutting-edge blockchain analysis platform.
Data engineering is a core enabler of Elliptic's offerings. We leverage open source technologies but have also written our own high-performance in-memory data pipelines and analytics engine. With your help, we can extend the breadth and depth of our data curation and analytics capabilities.

What you'll do:
Overall, you will collaborate with engineers and data scientists to develop high-performance, flexible data pipelines, visualisations, and analytical systems.
What is the work like? What are the challenges?
- Immerse yourself in blockchains - how they work and the various services that use them
- Build robust systems to curate harvested data, combine it with public blockchains and validate the output
- Enhance tooling to enable a broad range of analytics and experiments
- Help make architectural and data model decisions (and implement them!)
- Research, prototype, and recommend new technologies and frameworks
- Help other developers - give advice, perform code reviews etc.
- Help to create a blockchain-independent data model of cryptocurrencies that allows us to build complex analytics across all our data sets
Elliptic's products are at the cutting edge of blockchain analytics. That means we are often tackling difficult problems in uncharted waters. There are more ideas than we have time to work on; the upside is that there is always interesting work to do across lots of different areas. There is plenty to discuss and fascinating work is the norm - working at Elliptic is never boring!
From a data engineering point-of-view the biggest challenges are:
- Ensuring a single source of truth and a robust ontology for recording facts about participants on the blockchain
- Balancing high-performance data processing with flexible querying and auditability
- Keeping track of downstream impacts of changes to the data platform
- Staying on top of the various blockchain developments, forks, etc. that could affect our systems
What we're looking for:
- 3+ years industry experience with Python, Scala, or Java - all three would be ideal!
- Deep knowledge of architecting data pipelines and modelling data for analytics (SQL or NoSQL)
- Hands-on use of workflow management frameworks (e.g. Airflow, Luigi, Apache NiFi)
- A fast learner, with enthusiasm for new technologies and for applying them appropriately (avoiding one-size-fits-all solutions)
- Proficiency with Linux
- Rigour in engineering best practices (documentation, code reviews, test automation)
- A passion for solving difficult problems
- Excellent attention to detail and an ability to understand requirements beyond the written word
- Strong communication skills
- Strong CS fundamentals, including a good working knowledge of algorithms, data structures, and concurrency

Bonus points for:
- Familiarity with DevOps tools (e.g. Docker, Ansible, and HashiCorp tools such as Terraform)
- Experience with graph databases (e.g. Neo4j)
- Experience with cloud-based computing (e.g. AWS, GCE, Azure)
- Data warehousing experience (e.g. Redshift)
- Exposure to real-time analytics and big data technologies (e.g. Spark, S3, Kafka/Kinesis etc.)
- Appetite for discussions about economics, money, identity and privacy
- Interest in cryptocurrencies
- Knowledge of, and a liking for, functional programming approaches
What we offer:
- Share options
- Private health insurance
- Work pension scheme
- Shiny laptop and multiple monitors
- Budget for training materials, events, and conferences
- Quarterly full-day offsites
- Annual three-day company offsite
- Coffee and beer!