To seed your database with Intrinio stock price data and keep it up-to-date, follow the steps below.
First, you will need to create database tables corresponding to the entities that make up this dataset.
The table for stock exchanges should look like this:
Which corresponds to the Stock Exchange API format:
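As an illustration, here is a sketch of such a table using SQLite DDL via Python's sqlite3 module. The column names are assumptions drawn from the fields the Stock Exchange API is documented to return; verify them against the API reference before adopting this schema.

```python
import sqlite3

# Hypothetical schema sketch -- column names mirror fields commonly returned
# by the Stock Exchange API (confirm against the API reference).
CREATE_STOCK_EXCHANGES = """
CREATE TABLE IF NOT EXISTS stock_exchanges (
    id           TEXT PRIMARY KEY,  -- Intrinio id (text)
    name         TEXT,
    mic          TEXT,              -- Market Identifier Code
    acronym      TEXT,
    city         TEXT,
    country      TEXT,
    country_code TEXT,
    sync         INTEGER DEFAULT 0  -- optional: mark exchanges to keep up-to-date
);
"""

conn = sqlite3.connect(":memory:")
conn.execute(CREATE_STOCK_EXCHANGES)
```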
Note that the primary key is the Intrinio id field, which is text. If you would prefer to key your tables with integers, use an integer primary key and store the Intrinio id in an intrinio_id column with a unique constraint.
You may want to add a sync boolean field for specific stock exchanges that you wish to keep up-to-date.
A stock exchange has many securities trading on it. The table for securities should look like this:
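A sketch of the securities table, again in SQLite DDL for illustration; the column names are assumptions based on fields the Securities endpoints are documented to return, so check them against the API reference.

```python
import sqlite3

# Hypothetical schema sketch for securities; a security belongs to a
# stock exchange via stock_exchange_id.
CREATE_SECURITIES = """
CREATE TABLE IF NOT EXISTS securities (
    id                TEXT PRIMARY KEY,  -- Intrinio id (text)
    stock_exchange_id TEXT REFERENCES stock_exchanges (id),
    name              TEXT,
    ticker            TEXT,
    composite_ticker  TEXT,
    figi              TEXT,
    currency          TEXT,
    sync              INTEGER DEFAULT 0  -- optional: mark securities to keep up-to-date
);
"""

conn = sqlite3.connect(":memory:")
conn.execute(CREATE_SECURITIES)
```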
You will likely want to add database indexes on various security columns, such as ticker, currency, figi, stock_exchange_id, etc. This will depend on your use cases.
You may also want to add a sync boolean field for specific securities that you wish to keep up-to-date later on.
A security has many stock price records. The table for stock prices should look like this:
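A sketch of the stock prices table; the raw and adj_-prefixed price columns are assumptions mirroring the fields the Stock Prices API is documented to return, so verify the exact names against the API reference.

```python
import sqlite3

# Hypothetical schema sketch: one row per security, date, and frequency.
CREATE_STOCK_PRICES = """
CREATE TABLE IF NOT EXISTS stock_prices (
    security_id TEXT REFERENCES securities (id),
    date        TEXT NOT NULL,
    frequency   TEXT NOT NULL DEFAULT 'daily',
    open REAL, high REAL, low REAL, close REAL, volume REAL,
    adj_open REAL, adj_high REAL, adj_low REAL, adj_close REAL, adj_volume REAL,
    PRIMARY KEY (security_id, date, frequency)
);
"""

conn = sqlite3.connect(":memory:")
conn.execute(CREATE_STOCK_PRICES)
```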
A security’s stock price may be adjusted for splits and dividends. These are stored in a stock price adjustments table, which should look like this:
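A sketch of the adjustments table; the factor, dividend, and split columns are assumptions based on the fields the Stock Price Adjustments data is documented to include, so confirm them against the API reference.

```python
import sqlite3

# Hypothetical schema sketch: one adjustment row per security and date.
CREATE_STOCK_PRICE_ADJUSTMENTS = """
CREATE TABLE IF NOT EXISTS stock_price_adjustments (
    security_id       TEXT REFERENCES securities (id),
    date              TEXT NOT NULL,
    factor            REAL,
    dividend          REAL,
    dividend_currency TEXT,
    split_ratio       REAL,
    PRIMARY KEY (security_id, date)
);
"""

conn = sqlite3.connect(":memory:")
conn.execute(CREATE_STOCK_PRICE_ADJUSTMENTS)
```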
Now that your schema is in place, it’s time to use the Intrinio API to seed your tables and keep them up-to-date.
First, let’s use the All Stock Exchanges API to get all stock exchanges. This endpoint returns all exchanges, so no paging is necessary.
Load these records into the stock_exchanges table.
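A minimal loading sketch, assuming the records arrive as dicts parsed from the API's JSON response (the field names and sample rows below are illustrative, not actual API output). An upsert keeps the load idempotent so it can be re-run safely.

```python
import sqlite3

def load_stock_exchanges(conn, records):
    """Upsert exchange records (dicts keyed by assumed API field names)."""
    conn.executemany(
        """INSERT INTO stock_exchanges (id, name, mic, acronym)
           VALUES (:id, :name, :mic, :acronym)
           ON CONFLICT (id) DO UPDATE SET
               name = excluded.name,
               mic = excluded.mic,
               acronym = excluded.acronym""",
        records,
    )

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE stock_exchanges (id TEXT PRIMARY KEY, name TEXT, mic TEXT, acronym TEXT)"
)
# In practice these records come from the All Stock Exchanges endpoint
# (no paging needed); hand-written sample rows stand in here.
sample = [
    {"id": "sxg_sample", "name": "New York Stock Exchange", "mic": "XNYS", "acronym": "NYSE"},
]
load_stock_exchanges(conn, sample)
load_stock_exchanges(conn, sample)  # re-running does not duplicate rows
```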
For each stock exchange that you are interested in, fetch the securities in that exchange using the Securities by Exchange API. Use the stock exchange id as the identifier, for maximum specificity. Use Paging so that you get all of the results.
Load these records into the securities table.
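The paging loop can be sketched as follows. The next_page token mechanics and the base URL are assumptions based on how the Paging docs describe cursor-style pagination; confirm the exact parameter and field names in the API reference. The pager is written against an injectable fetch function so the HTTP details stay separate.

```python
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
BASE_URL = "https://api-v2.intrinio.com"  # confirm against the API reference

def fetch_all_pages(fetch_page, key):
    """Yield every record under `key`, following next_page tokens until the
    response no longer includes one (cursor-style paging)."""
    next_page = None
    while True:
        body = fetch_page(next_page)
        yield from body.get(key, [])
        next_page = body.get("next_page")
        if not next_page:
            break

def http_page_fetcher(path, params):
    """Build a fetch_page function backed by a real HTTP GET (not run here)."""
    def fetch_page(next_page):
        query = dict(params, api_key=API_KEY)
        if next_page:
            query["next_page"] = next_page
        url = f"{BASE_URL}{path}?{urllib.parse.urlencode(query)}"
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)
    return fetch_page

# Usage sketch (requires a valid API key, so it is not executed here):
# pages = http_page_fetcher(f"/stock_exchanges/{exchange_id}/securities", {})
# securities = list(fetch_all_pages(pages, "securities"))
```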
Now that you have securities in your database, let’s populate their stock prices using the Stock Prices by Security API. Iterate over each security and use its id as the identifier, for maximum specificity. Use Paging so that you get all of the results.
Load these records into the stock_prices table, making sure to append security_id to each record, based on the id of the security used.
Load these records into the stock_price_adjustments table, making sure to append security_id to each record, based on the id of the security used.
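Appending the security_id before inserting can be sketched like this (a trimmed schema and illustrative field names are used for brevity; the same pattern applies to the adjustments table). INSERT OR REPLACE keeps the load idempotent per security and date.

```python
import sqlite3

def load_stock_prices(conn, security_id, records):
    """Attach the owning security's id to each price record, then upsert."""
    rows = [dict(r, security_id=security_id) for r in records]
    conn.executemany(
        """INSERT OR REPLACE INTO stock_prices
               (security_id, date, close, adj_close)
           VALUES (:security_id, :date, :close, :adj_close)""",
        rows,
    )

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE stock_prices (
    security_id TEXT, date TEXT, close REAL, adj_close REAL,
    PRIMARY KEY (security_id, date))""")

# Illustrative records standing in for a page of API results.
records = [{"date": "2024-01-02", "close": 10.0, "adj_close": 10.0}]
load_stock_prices(conn, "sec_sample", records)
load_stock_prices(conn, "sec_sample", records)  # safe to re-run
```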
There are two methods to keep your database up-to-date as daily stock prices continue to accumulate:
A full rebuild requires following the Seeding steps above on a nightly basis. You must either clear each database table before re-seeding or use an upsert mechanism. This method is the safest but also the slowest. Use cases include:
Updating your database incrementally is a fast way to get the freshest data, but it does require a few considerations that we will outline below. The ideal use cases include:
In order to incrementally update your database, perform the following steps for each stock_exchanges record for which you want to get up-to-date data:
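As a hedged illustration of one building block for such an update (an assumption, not the official recipe): track the most recent price date already stored per security, and request only newer data on each pass.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE stock_prices (
    security_id TEXT, date TEXT, close REAL,
    PRIMARY KEY (security_id, date))""")

def last_stored_date(conn, security_id):
    """Most recent price date already stored for this security, or None.
    ISO-8601 date strings compare correctly as text."""
    return conn.execute(
        "SELECT max(date) FROM stock_prices WHERE security_id = ?",
        (security_id,),
    ).fetchone()[0]

# One plausible incremental pass (assumption): for each security flagged
# sync = true, request only prices newer than last_stored_date (e.g. via a
# start date parameter, if the endpoint supports one) and upsert the results.
```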
Stock price adjustments affect a security’s entire price history. A stock split of 2:1 causes a security’s trading price to fall by half the following trading day, since everyone’s shares were effectively doubled. In order to do meaningful analysis on stock prices, you have to adjust the price historically for both splits and dividends. The Intrinio API provides this data via adj_ fields, but in order to keep your database up-to-date, you have to re-fetch a security’s entire price history when there is a new adjustment.
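The detection logic can be sketched as a simple set comparison between the adjustment dates you have stored and the ones the API currently reports (the function and its inputs are illustrative, not part of the API):

```python
def needs_full_refetch(stored_adjustment_dates, fetched_adjustment_dates):
    """True when the API reports an adjustment we have not seen before.
    A new split or dividend changes the adj_ fields across the entire
    history, so the security's full price series must be re-fetched."""
    return not set(fetched_adjustment_dates) <= set(stored_adjustment_dates)
```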
Even though incremental updates are an effective way to keep your database up-to-date, we recommend that you perform a Full Rebuild every weekend. This will allow you to take advantage of corrections made by the Intrinio Data Quality team and to smooth out any failed updates or other inconsistencies.
Please consider the following topics when writing your data syncing operation.
Your Intrinio account has certain limits for accessing our API. The most common limit is requests-per-minute, which determines how many HTTP requests you can send to our API in a wall-clock minute before it starts to respond with 429 errors. See the Limits section of our API reference for more details.
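A minimal client-side throttle sketch that stays under a requests-per-minute budget (the limit value is an example; use the one from your plan):

```python
import collections
import time

class RateLimiter:
    """Cap outbound requests to `limit` per `window` seconds so the API
    does not start answering with 429s."""

    def __init__(self, limit, window=60.0, clock=time.monotonic, sleep=time.sleep):
        self.limit = limit
        self.window = window
        self.clock = clock    # injectable for testing
        self.sleep = sleep    # injectable for testing
        self.sent = collections.deque()  # timestamps of recent requests

    def wait(self):
        """Block until one more request may be sent, then record it."""
        now = self.clock()
        while self.sent and now - self.sent[0] >= self.window:
            self.sent.popleft()
        if len(self.sent) >= self.limit:
            delay = self.window - (now - self.sent[0])
            self.sleep(delay)
            now += delay
            while self.sent and now - self.sent[0] >= self.window:
                self.sent.popleft()
        self.sent.append(now)

# Usage: call limiter.wait() immediately before every HTTP request.
limiter = RateLimiter(limit=300)  # e.g. a 300 requests/minute budget
limiter.wait()
```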
In order to maximize performance and reliability, we recommend that you use the default page size of 100 when accessing our API (omitting the page_size parameter has the same effect). This default page size is the optimal setting for ensuring fast, consistent API responses.
Integrating with an API requires a lenient mindset towards availability. Intermittent network issues or a deploy gone wrong may prevent some of your requests from succeeding. In order to make your data syncing operation robust, we recommend using a “retry” library or code module to automatically retry failed requests several times before throwing an exception. Here are a few examples:
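A hand-rolled retry helper with exponential backoff can be sketched like this; libraries such as tenacity (Python) or urllib3's Retry provide the same pattern off the shelf.

```python
import time

def with_retries(fn, attempts=5, base_delay=1.0, sleep=time.sleep,
                 retry_on=(Exception,)):
    """Call fn(); on failure, sleep with exponential backoff (1s, 2s, 4s, ...)
    and try again, re-raising the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))

# Usage: wrap each HTTP request, e.g. with_retries(lambda: fetch_page(url)).
```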
Another way to make your data syncing operation more robust is to use a job processing system. A job processing system can provide you with several important benefits:
Here are some examples of job processing systems:
You may also consider writing your own. Intrinio uses a custom-built, sophisticated job processing system in order to power our data processing engine.