ABOUT ME

Hello! I'm Laurence. I specialize in Analytics Development, where I combine my interests in data and technology to solve diverse and challenging problems. Throughout my career I've engineered data pipelines, built machine learning models and developed automated reporting workflows. My toolkit, which includes Azure, Docker, Python, R and SQL, empowers me to streamline these processes and consistently deliver innovative and impactful solutions.

Fun Fact: This page has been viewed times!

P.S. This page was inspired by Gwyn's video as part of the Cloud Resume Challenge, an initiative thoughtfully crafted by Forrest Brazeal. The GitHub repo can be found here and the blog post on my journey can be found here.

Certifications

Azure Fundamentals AZ-900

Work

Baylor Scott and White Health

Analytics Developer | September 2022 - Present

  • Collaborated with cross-functional teams to gather data requirements, then designed scalable, efficient ETL processes to meet them using Azure services such as Data Factory and Logic Apps.
  • Boosted productivity by transforming recurring R reports into batch jobs and migrating the process to an Azure Virtual Machine.
  • Migrated desktop-based machine learning workflows to an Azure Machine Learning environment, refactoring R codebases for autonomous execution and using MLflow for comprehensive model tracking.
Mathematica

Data Analytics Developer | May 2022 - September 2022

  • Developed an R-based data reporting pipeline that processed data from the Centers for Medicare and Medicaid Services into customized Excel workbooks, replacing a two-week manual process and freeing substantial staff time.
  • Used Plotly to create interactive visualizations offering in-depth insight into Medicare expenditure policy options, helping researchers make informed decisions for the Centers for Medicare and Medicaid Services' fiscal year 2023 budget.
University of Florida

Application Developer Analyst | February 2019 - May 2022

  • Developed an algorithm, in collaboration with researchers, that calculated the Pediatric Sequential Organ Failure Assessment (pSOFA) score from electronic medical records. The algorithm enabled the creation of novel datasets, providing insight into the predictive value of the pSOFA score for assessing mortality risk among critically ill children.
  • Built a Shiny dashboard that provided insights into employee time allocation across projects and automated the creation of monthly project invoices. This dashboard was containerized using Docker and deployed on a Linux server, streamlining the invoice generation process with a user-friendly web interface that allowed invoices to be generated with one click.
  • Designed and implemented ETL pipelines and automated reporting workflows using R, Docker, MySQL and REDCap to support the Screen, Test and Protect initiative at the University of Florida. These solutions improved data accuracy, significantly reduced data processing times, and provided critical support to disease investigators in their COVID-19 testing and contact tracing efforts across campus. This work was instrumental in reducing the number of new cases and enabled a safe reopening during the pandemic, protecting the health of over 40,000 faculty, staff and students.
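The containerized Shiny deployment mentioned above can be sketched as a minimal Dockerfile. The rocker/shiny base image serves apps from /srv/shiny-server on port 3838; the app directory name and package list below are assumptions, not the actual project layout.

```dockerfile
# Hypothetical sketch: containerizing a Shiny dashboard for a Linux server.
FROM rocker/shiny:4.3.1

# Install any extra R packages the dashboard needs (names illustrative)
RUN R -e "install.packages(c('dplyr', 'DT'), repos = 'https://cloud.r-project.org')"

# Copy the app into Shiny Server's app directory
COPY app/ /srv/shiny-server/invoice-dashboard/

# Shiny Server listens on 3838 by default
EXPOSE 3838
```

Built with `docker build -t invoice-dashboard .` and run with `docker run -p 3838:3838 invoice-dashboard`, the dashboard would be reachable in a browser on port 3838.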
Data Targeting

Data Analyst | March 2018 - October 2018

  • Built RMarkdown reports to conduct in-depth analysis of political action committee contributions, focusing on donor retention rates and crafting actionable strategies for fostering donation growth.
  • Collaborated with Data Science team to implement logistic regression models in Python, resulting in the optimization of targeted advertising campaigns.
  • Created tables and queries with PostgreSQL to identify target segments for mail, phone, and digital campaigns.
Education

University of Florida

B.S. Statistics | August 2017

Projects

redcapcustodian

An R package that simplifies data management activities on REDCap systems. It provides a framework for automating extract, transform, and load (ETL) work within a REDCap project, between REDCap projects, between REDCap systems, and with the REDCap database. The package offers an extensible set of R functions, a Docker image, and an RStudio project template upon which a REDCap team can build ETL tasks that serve their REDCap systems and customers.
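The extraction step that such an ETL framework automates can be illustrated against the standard REDCap export API, which accepts POSTed form fields (token, content, format, type). This is a standard-library Python sketch, not redcapcustodian's actual R API; the URL, token, and helper names are placeholders.

```python
# Hypothetical sketch of the "extract" step of a REDCap ETL task.
# Uses the standard REDCap export API (POST form fields to /api/).
import json
import urllib.parse
import urllib.request


def build_export_payload(token: str) -> dict:
    """Form fields for a REDCap 'export records' API call."""
    return {
        "token": token,        # project-specific API token (placeholder)
        "content": "record",   # export project records
        "format": "json",      # return JSON rather than CSV/XML
        "type": "flat",        # one row per record
    }


def export_records(api_url: str, token: str) -> list:
    """POST the export request and decode the JSON response."""
    data = urllib.parse.urlencode(build_export_payload(token)).encode()
    request = urllib.request.Request(api_url, data=data)
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode())
```

The decoded records would then feed the transform and load steps, for example writes into another REDCap project or a database table.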