EXPERTISE

FINANCIAL MARKET PREDICTION

Built a data pipeline from over 1,400 real-time news and social feeds using Apache Kafka. Developed a natural language processing and sentiment analysis API in Python, including event detection, built on top of MongoDB and Avro on Hadoop and deployed in Docker containers. The system harnesses collective intelligence to provide decision makers with actionable information for commodities trading.
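
A minimal sketch of the kind of feed-ingestion and sentiment-scoring step described above, assuming hypothetical topic and field names; kafka-python and NLTK's VADER analyzer are illustrative library choices, not necessarily those used on the project:

```python
# Illustrative sketch: consume news items from Kafka and attach a sentiment score.
# Topic names, broker address, and field names are hypothetical.
import json

from kafka import KafkaConsumer, KafkaProducer          # pip install kafka-python
from nltk.sentiment import SentimentIntensityAnalyzer   # pip install nltk; nltk.download('vader_lexicon')

analyzer = SentimentIntensityAnalyzer()

consumer = KafkaConsumer(
    "raw-news-feed",                                     # hypothetical input topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

for message in consumer:
    item = message.value
    # VADER's compound score runs from -1 (very negative) to +1 (very positive).
    scores = analyzer.polarity_scores(item.get("headline", ""))
    item["sentiment"] = scores["compound"]
    producer.send("scored-news-feed", item)              # hypothetical output topic
```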

ONLINE PAYMENTS

Revamped financial reporting: built a data warehouse on Google Cloud (BigQuery, Cloud Storage, Compute Engine), including ELT with custom Python and open-source components; built a comprehensive cost model covering card processing costs, interchange fees, risk and fraud costs, and infrastructure and support costs. Led BI/visualization tool selection.
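
A hedged sketch of one ELT load step of the kind described: landing a raw export in Cloud Storage and loading it into a BigQuery staging table for downstream SQL transformation. Project, dataset, table, and bucket names are hypothetical.

```python
# Illustrative ELT load step: raw file in Cloud Storage -> BigQuery staging table.
# Project, dataset, table, and bucket names are hypothetical.
from google.cloud import bigquery   # pip install google-cloud-bigquery

client = bigquery.Client(project="payments-analytics")            # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,                # let BigQuery infer the staging schema
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://payments-raw-exports/transactions/2020-02-18.csv",      # hypothetical bucket/path
    "payments-analytics.staging.transactions",                    # hypothetical staging table
    job_config=job_config,
)
load_job.result()   # wait for completion; transforms then run as SQL inside BigQuery
```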

CLOUD MIGRATION

Legacy modernization for a mobile security company: redesigned a petabyte-scale data warehouse, deployed the architecture and ETL pipelines, and migrated from an on-prem RDBMS to an AWS system with HDFS, Hive, and Redshift using Python, Apache Airflow, and lots of SQL. Designed a simplified, more usable analytical data model and created a much more targeted set of reports in Tableau.
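
As an illustration of the migration pipelines mentioned above, a minimal Airflow DAG that extracts from the legacy database and loads into Redshift; the DAG ID, schedule, and the two task callables are hypothetical placeholders.

```python
# Illustrative Airflow DAG: nightly extract from the legacy RDBMS, load into Redshift.
# DAG ID, schedule, and the two task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_legacy(**context):
    """Placeholder: pull the previous day's rows from the on-prem RDBMS."""
    ...


def load_into_redshift(**context):
    """Placeholder: COPY the extracted files into the Redshift warehouse."""
    ...


with DAG(
    dag_id="legacy_to_redshift_nightly",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_from_legacy", python_callable=extract_from_legacy)
    load = PythonOperator(task_id="load_into_redshift", python_callable=load_into_redshift)

    extract >> load   # the load runs only after the extract succeeds
```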

B2B E-COMMERCE CUSTOMER ENGAGEMENT

For a major clothing brand, designed a data model that integrates the transactional data store with web tracking information, then built a set of enriched analytics, dashboards, and interactive visualizations to quickly surface key insights such as site adoption, engagement, behavior, and conversion rates across different B2B segments.
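
A small pandas sketch of the kind of segment-level conversion metric this enables, assuming hypothetical column names in the joined transactional and web-tracking data:

```python
# Illustrative metric: conversion rate by B2B segment from joined session/order data.
# The sample rows and column names are hypothetical.
import pandas as pd

sessions = pd.DataFrame({
    "account_id":   [1, 1, 2, 3, 3, 3],
    "segment":      ["wholesale", "wholesale", "boutique", "franchise", "franchise", "franchise"],
    "placed_order": [True, False, False, True, False, True],
})

conversion_by_segment = (
    sessions.groupby("segment")["placed_order"]
    .mean()                      # share of sessions that ended in an order
    .rename("conversion_rate")
    .reset_index()
)
print(conversion_by_segment)
```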

ONLINE NEWS

A large media publisher had multiple siloed systems and contradictory metrics from different sources. Designed a fully integrated data warehouse in the AWS cloud using Spark, Scala, and Redshift; put Tableau reporting and visualization in place; developed predictive algorithms in R; implemented an A/B testing framework; and deployed machine learning for recommendations and ad targeting.
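
As one building block of the A/B testing framework mentioned above, a hedged sketch of a two-proportion z-test on click-through counts; the counts are made-up example data.

```python
# Illustrative A/B test: compare click-through rates of two article layouts
# with a two-proportion z-test. The counts are made-up example data.
from statsmodels.stats.proportion import proportions_ztest   # pip install statsmodels

clicks = [420, 515]           # conversions in variant A and variant B
impressions = [10000, 10000]  # impressions shown for each variant

z_stat, p_value = proportions_ztest(count=clicks, nobs=impressions)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference in click-through rate is unlikely to be chance.
```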

FINANCIAL REGULATION COMPLIANCE

Shepherded a large international bank to compliance with new international banking regulations: modified complex processes across multiple groups, designed changes to multiple data stores, and led the in-flight implementation.

CROSS-CHANNEL CUSTOMER JOURNEYS

At one of the world’s largest banks, linked a comprehensive set of customer behavior data across multiple channels, then created a diagnostic visualization tool for exploring both positive and problematic journeys and pinpointing changes that improve customer experience and satisfaction.
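
A minimal sketch of the cross-channel stitching step this kind of work rests on, assuming hypothetical event records keyed by a shared customer ID:

```python
# Illustrative journey stitching: order each customer's events across channels by time.
# The event fields and sample rows are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "customer_id": ["c1", "c1", "c1", "c2", "c2"],
    "channel":     ["mobile_app", "call_center", "branch", "web", "call_center"],
    "event":       ["view_rates", "complaint", "account_opened", "login_failed", "password_reset"],
    "timestamp":   pd.to_datetime([
        "2020-02-01 09:00", "2020-02-01 10:30", "2020-02-03 14:00",
        "2020-02-02 08:15", "2020-02-02 08:40",
    ]),
})

# One journey per customer: the time-ordered sequence of (channel, event) pairs.
journeys = (
    events.sort_values("timestamp")
    .groupby("customer_id")[["channel", "event"]]
    .apply(lambda g: list(zip(g["channel"], g["event"])))
)
print(journeys)
```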

ENERGY AND ENVIRONMENT

Designed and deployed data-driven products with an industry-leading energy group: complex energy pricing and carbon modeling algorithms built with C#, Python, R, Django, PostgreSQL, and MSSQL, and exposed to multiple customers through RESTful services.
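
A hedged sketch of exposing a pricing-style calculation through a RESTful endpoint, in the spirit of the Django services described; the formula, field names, carbon price, and route are purely illustrative placeholders, not the project's actual model.

```python
# Illustrative Django view: price an energy contract with a carbon cost adder.
# The formula, field names, default values, and URL route are hypothetical placeholders.
from django.http import JsonResponse
from django.views.decorators.http import require_GET

CARBON_PRICE_PER_TONNE = 25.0   # hypothetical carbon price, in the contract currency


@require_GET
def quote(request):
    volume_mwh = float(request.GET.get("volume_mwh", 0))
    base_price = float(request.GET.get("base_price_per_mwh", 0))
    emissions_factor = float(request.GET.get("tonnes_co2_per_mwh", 0.4))  # hypothetical default

    energy_cost = volume_mwh * base_price
    carbon_cost = volume_mwh * emissions_factor * CARBON_PRICE_PER_TONNE

    return JsonResponse({
        "energy_cost": round(energy_cost, 2),
        "carbon_cost": round(carbon_cost, 2),
        "total": round(energy_cost + carbon_cost, 2),
    })

# Wired into a project's urls.py with something like: path("api/quote/", quote)
```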

COMBATING GAMBLING ADDICTION

At-risk behavior modeling for an online gaming company: developed, tested, and implemented a machine learning model in R that identifies harmful customer behavior; the model now drives a preventative alert system. The work included extensive feature engineering and exploratory analysis to identify characteristics of game play.
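
The production model described above was built in R; purely as an illustration of the approach, here is a Python sketch of the same idea, training a simple classifier on hypothetical engineered play-behavior features and synthetic labels.

```python
# Illustrative only: the production model was built in R.
# Feature names, labels, and the synthetic data below are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hypothetical engineered features: session length, late-night share of play, deposit frequency.
X = np.column_stack([
    rng.normal(60, 30, n),      # minutes per session
    rng.uniform(0, 1, n),       # share of play between midnight and 6am
    rng.poisson(3, n),          # deposits per week
])
# Synthetic "at risk" label for demonstration purposes.
y = (X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.3, n) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# In production, scores above a chosen threshold would raise a preventative alert.
```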

AI ROBOT MANUFACTURING

Collected sensor data from food packaging robots and ran it through machine learning algorithms to detect when a robot is about to break, so technicians can service it before a catastrophic failure that is expensive to fix and creates downtime on the line.
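
A minimal sketch of one way such early-warning detection can work, flagging anomalous readings with an isolation forest; the sensor features, values, and contamination rate are hypothetical.

```python
# Illustrative predictive-maintenance check: flag anomalous sensor readings
# so a technician can inspect the robot before it fails. All data is synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Hypothetical training window of normal operation: [vibration_rms, motor_temp_c].
normal_readings = np.column_stack([
    rng.normal(0.5, 0.05, 2000),
    rng.normal(55.0, 2.0, 2000),
])

detector = IsolationForest(contamination=0.01, random_state=1).fit(normal_readings)

# Latest readings from the line; the last one drifts toward failure-like behavior.
latest = np.array([
    [0.51, 54.8],
    [0.49, 56.1],
    [0.92, 71.3],
])
flags = detector.predict(latest)   # -1 means anomalous, 1 means normal
for reading, flag in zip(latest, flags):
    if flag == -1:
        print("Service alert for reading:", reading)
```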

E-COMMERCE CHURN ANALYSIS

A subscription-based fashion e-commerce startup needed to better understand how customer interaction with its services led to increased commitment or churn. In a targeted, time-boxed effort, designed a new analysis model, built a new data mart, and deployed customer explorer and retention analytic models in both Tableau and Looker. These applications helped the client understand the relationship between specific aspects of the customer experience (e.g. shipping times, style advice) and the propensity to renew or churn.
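
A small sketch of the kind of relationship such retention models surface, relating a customer-experience variable to renewal with a logistic regression; the data and effect size below are entirely synthetic.

```python
# Illustrative retention analysis: relate shipping time to the probability of renewal.
# The data below is synthetic and the effect size is made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 500

shipping_days = rng.integers(1, 10, n).astype(float)
# Synthetic ground truth: longer shipping times reduce the chance of renewal.
p_renew = 1 / (1 + np.exp(-(2.0 - 0.4 * shipping_days)))
renewed = rng.binomial(1, p_renew)

model = LogisticRegression().fit(shipping_days.reshape(-1, 1), renewed)
print("estimated effect of one extra shipping day on the log-odds of renewal:",
      round(model.coef_[0][0], 2))
```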
