Building a Remote Currency Database with Docker and API Scraping

A remote SQL currency database built with Docker and daily API scraping, linked with GDP data to study exchange-rate effects.
Author

Oliver F. Anderson, MS

Published

November 25, 2022

Keywords

Docker, SQL, currency database, exchange rates, API scraping, R, GDP

This project focused on building a currency exchange database that updates daily and can be accessed remotely.

Workflow

The workflow combined API scraping, SQL database design, and deployment in a Docker container.

  1. Exchange Rates: Scraped daily USD-based exchange rates from exchangerate.host.
  2. Database Deployment: Ran the SQL database inside a Docker container and hosted it on Railway.app for remote access.
  3. Historical Data: Imported past exchange rate data to align with GDP growth periods.
  4. SQL Views: Created SQL views that recompute monthly average exchange rates automatically as new daily rates arrive.
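The daily pull in step 1 can be sketched as a small shell script run on a schedule. This is a minimal illustration, not the project's actual script: the historical-rates URL pattern is an assumption about exchangerate.host's free API, and the commented fetch line shows where the download would happen.

```shell
#!/bin/sh
# Sketch of the daily exchange-rate pull (URL pattern is an assumption
# about exchangerate.host's API, not the exact script used here).
base="USD"
day=$(date -u +%F)                                  # today's date, e.g. 2022-11-25
url="https://api.exchangerate.host/${day}?base=${base}"
echo "$url"
# curl -s "$url" -o "rates_${day}.json"             # actual fetch, run daily by cron
```

A cron entry (e.g. `0 6 * * * /opt/currency/pull_rates.sh`, path hypothetical) would then run this once per day, matching the update cadence described below.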

Results

Fig 1. - Entity relationship diagram for the currency database. Primary keys and linking relationships are shown.

The database design linked exchange rates, GDP data, and currency codes into a single relational structure. Views simplified queries by producing rolling monthly averages.
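The structure in Fig 1 might look roughly like the following PostgreSQL-flavored DDL. Table and column names here are illustrative reconstructions from the ERD description, not the project's exact schema:

```sql
-- Illustrative schema (names are assumptions based on the ERD,
-- not the exact DDL used in the project).
CREATE TABLE currency (
    code CHAR(3) PRIMARY KEY,              -- ISO 4217 code, e.g. 'EUR'
    name TEXT NOT NULL
);

CREATE TABLE exchange_rate (
    rate_date   DATE NOT NULL,
    code        CHAR(3) NOT NULL REFERENCES currency (code),
    rate_to_usd NUMERIC(18, 6) NOT NULL,   -- units of currency per 1 USD
    PRIMARY KEY (rate_date, code)
);

CREATE TABLE gdp (
    year       INT NOT NULL,
    code       CHAR(3) NOT NULL REFERENCES currency (code),
    growth_pct NUMERIC(6, 3),              -- annual real GDP growth, %
    PRIMARY KEY (year, code)
);

-- A view recomputes monthly averages on every query, so results
-- stay current as daily rows are inserted -- no refresh step needed.
CREATE VIEW monthly_avg_rate AS
SELECT code,
       date_trunc('month', rate_date) AS month,
       AVG(rate_to_usd)               AS avg_rate_to_usd
FROM exchange_rate
GROUP BY code, date_trunc('month', rate_date);
```

Because a plain view is just a stored query, the monthly averages update automatically as the daily scraper inserts new rows, which is what makes this design low-maintenance.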

Using R, I analyzed GDP growth alongside exchange-rate shifts, focusing on the fastest-growing economies of 2017.

Fig 2. - Exchange rates compared to USD for the fastest-growing economies of 2017. The US dollar performed strongly relative to other currencies.

Discussion

The analysis showed that while some countries had high GDP growth in 2017, their currencies weakened against the US dollar. The Euro, which often tracks closely with USD, also lagged during this period.

This highlights a key point in international comparisons: GDP growth figures can look different once exchange rate effects are considered. A strong dollar can exaggerate the underperformance of other currencies, making raw growth metrics less comparable across borders.
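A toy calculation in R makes the mechanism concrete. The numbers below are made up for illustration, not taken from the project's data:

```r
# Hypothetical example: strong local-currency GDP growth can disappear
# once restated in USD terms (all figures invented for illustration).
g_local <- 0.07    # 7% real GDP growth in local currency
fx_chg  <- -0.10   # local currency depreciated 10% against USD

# Growth as seen from a USD perspective:
g_usd <- (1 + g_local) * (1 + fx_chg) - 1
g_usd              # -0.037, i.e. about -3.7% in dollar terms
```

In this sketch a 7% domestic expansion reads as a 3.7% contraction in dollar terms, which is the distortion the analysis above describes.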

FAQ: Docker, Data Scraping, and Financial APIs

Why use Docker?

Packaging the database in a Docker container made it portable and easy to deploy on Railway.

Why scrape data instead of downloading CSVs?

Daily API scraping keeps the database current automatically, rather than relying on periodic manual CSV downloads.

How often does the database update?

Exchange rates are pulled once per day, while SQL Views provide rolling monthly averages.

Can this workflow be applied to other financial data?

Yes. The structure works for any API-driven dataset where regular updates and historical context are important.

Oliver F. Anderson, MS – Computational Biologist, Data Scientist, and Research Consultant based in Portland, Oregon. I design data-driven solutions in bioinformatics, machine learning, and AI automation for research and biotech.
