Before you begin writing a scraper, check whether the exchange offers integration through WebSockets. If it does, implement the scraper using the WebSocket protocol instead of polling a RESTful API.
To scrape data from an exchange data source, called "MySource" for instance, follow these steps:

Create a new file named `MySourceScraper.go` in the directory `pkg/dia/scraper/exchange-scrapers/`.
To allow the platform to call your scraper, implement a scraper that conforms to the interface from the scrapers package:
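As a rough orientation, the interface looks something like the sketch below, with simplified stand-in types; the authoritative definition lives in the scrapers package of the diadata repository, and the exact method set and signatures there may differ:

```go
// Hedged sketch of the scraper interfaces; Trade and ExchangePair are
// simplified stand-ins for the types defined in the dia package.
package main

import "fmt"

type Trade struct {
	Symbol string
	Price  float64
}

type ExchangePair struct {
	Symbol      string
	ForeignName string
}

// PairScraper is the per-pair interface a MySourcePairScraper must satisfy.
type PairScraper interface {
	Close() error
	Error() error
	Pair() ExchangePair
}

// APIScraper is the exchange-level interface a MySourceScraper must satisfy.
type APIScraper interface {
	ScrapePair(pair ExchangePair) (PairScraper, error)
	Channel() chan *Trade
	Close() error
}

func main() {
	var s APIScraper // a completed MySourceScraper would be assignable here
	fmt.Println(s == nil) // true
}
```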
Start building your scraper by writing a function with the signature `NewMySourceScraper(exchangeName string) *MySourceScraper`. We suggest that this function start a `mainLoop()` method in a goroutine, which continuously receives trade information through the trade channel of `MySourceScraper` for as long as the channel is open.
To implement the interface, you must include the `ScrapePair` method, which returns a `MySourcePairScraper` for a specific pair, so the main collection method can iterate over all trading pairs. Note that `MySourcePairScraper` must in turn implement the corresponding pair scraper interface.
Also, to ensure proper error handling and cleanup, include an `Error()` method, which returns an error as soon as the scraper's channel closes, as well as `Close()` and `cleanup()` methods that handle closing and shutting down the channels.
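Under the same assumptions, `ScrapePair`, the pair scraper, and the `Error()`/`Close()`/`cleanup()` trio might look like this; field names and error messages are illustrative, not taken from the diadata source:

```go
// Hedged sketch of the per-pair scraper and shutdown handling.
package main

import (
	"errors"
	"fmt"
)

type ExchangePair struct {
	Symbol      string
	ForeignName string
}

type MySourceScraper struct {
	shutdown     chan struct{}
	shutdownDone chan struct{}
	err          error
	closed       bool
	pairScrapers map[string]*MySourcePairScraper
}

type MySourcePairScraper struct {
	parent *MySourceScraper
	pair   ExchangePair
}

// ScrapePair registers a pair and returns a MySourcePairScraper for it,
// so the main collection method can iterate over all trading pairs.
func (s *MySourceScraper) ScrapePair(pair ExchangePair) (*MySourcePairScraper, error) {
	if s.closed {
		return nil, errors.New("MySourceScraper: closed")
	}
	ps := &MySourcePairScraper{parent: s, pair: pair}
	s.pairScrapers[pair.ForeignName] = ps
	return ps, nil
}

// Error returns an error as soon as the scraper's channel is closed.
func (s *MySourceScraper) Error() error {
	select {
	case <-s.shutdownDone:
		return errors.New("MySourceScraper: scraper closed")
	default:
		return s.err
	}
}

// Close signals shutdown, runs cleanup, and reports the resulting state.
func (s *MySourceScraper) Close() error {
	if s.closed {
		return errors.New("MySourceScraper: already closed")
	}
	close(s.shutdown)
	s.cleanup()
	return s.Error()
}

// cleanup tears down channels and marks the scraper closed.
func (s *MySourceScraper) cleanup() {
	s.closed = true
	close(s.shutdownDone)
}

func main() {
	s := &MySourceScraper{
		shutdown:     make(chan struct{}),
		shutdownDone: make(chan struct{}),
		pairScrapers: map[string]*MySourcePairScraper{},
	}
	ps, _ := s.ScrapePair(ExchangePair{Symbol: "BTC", ForeignName: "BTC-USDT"})
	fmt.Println(ps.pair.ForeignName) // BTC-USDT
	s.Close()
	fmt.Println(s.Error() != nil) // true
}
```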
For a better understanding of how to set up a scraper, refer to an existing exchange scraper file; it illustrates the overall structure and logic.
To make the scraper visible to the system, add a reference to it in `Config.go` in the `dia` package:
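A hypothetical excerpt of what that reference might look like; the constant name `MySourceExchange` follows a common naming pattern but is an assumption here, so match the style of the existing entries in `Config.go`:

```go
// Hypothetical excerpt: register the new exchange's name as a
// constant so other packages can reference it.
package main

import "fmt"

const MySourceExchange = "MySource"

func main() {
	fmt.Println(MySourceExchange) // MySource
}
```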
Add a case for your scraper to the switch statement in the `pkg/dia/scraper/exchange-scrapers/APIScraper.go` file:
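The new case might look like the sketch below. The factory function name `NewAPIScraper` and its signature are assumptions made to keep the example self-contained; only the pattern of one `case` per exchange, dispatching to the scraper's constructor, comes from the text:

```go
// Hedged sketch of the exchange-dispatch switch in APIScraper.go.
package main

import "fmt"

type APIScraper interface{ Name() string }

type MySourceScraper struct{}

func (s *MySourceScraper) Name() string { return "MySource" }

func NewMySourceScraper(exchange string) *MySourceScraper {
	return &MySourceScraper{}
}

// NewAPIScraper picks the concrete scraper for a given exchange name.
func NewAPIScraper(exchange string) APIScraper {
	switch exchange {
	// ... cases for the existing exchanges ...
	case "MySource":
		return NewMySourceScraper(exchange)
	default:
		return nil
	}
}

func main() {
	fmt.Println(NewAPIScraper("MySource").Name()) // MySource
}
```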
Finally, add the exchange's pairs to the `config/MySourceExchange.json` config file:
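A hypothetical pairs config; the field names here are assumptions, so mirror the schema of an existing exchange's config file rather than copying this verbatim:

```json
{
  "Pairs": [
    {
      "Symbol": "BTC",
      "ForeignName": "BTC-USDT",
      "Exchange": "MySource"
    }
  ]
}
```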
Modify the `build/Dockerfile-genericCollector` and `build/Dockerfile-pairDiscoveryService` files and add the following two lines before the `RUN go mod tidy` step:
Build the necessary service containers by running the following commands:
Create a manifest for the new exchange scraper in a new `mysource.yaml` file. You can refer to existing manifest files for guidance or use the following template:
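A minimal hypothetical manifest, assuming the scraper runs as a Kubernetes Deployment; the image name and label values are placeholders, and the collector arguments mirror the flags shown in the build commands below:

```yaml
# Hypothetical manifest sketch -- mirror an existing scraper manifest
# for the real image, namespace, and environment variables.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mysource-scraper
spec:
  replicas: 1
  selector:
    matchLabels:
      app: mysource-scraper
  template:
    metadata:
      labels:
        app: mysource-scraper
    spec:
      containers:
        - name: collector
          image: diadata/collector:latest # placeholder image
          args: ["-exchange=MySource", "-mode=current", "-pairsfile=true"]
```

A manifest like this is typically deployed with `kubectl apply -f mysource.yaml`, as described in the kubectl step below.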
Before running the manifest, create a new entry in the database for the new exchange:
Deploy the manifest using the following kubectl command:
Hooray 🎉 Your scraper should now be running.
Centralized:
Bitfinex: `go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=Bitfinex -mode=current -pairsfile=true`
Bittrex: `go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=Bittrex -mode=current -pairsfile=true`
CoinBase: `go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=CoinBase -mode=current -pairsfile=true`
MEXC: `go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && collector -exchange=MEXC -mode=current -pairsfile=true`
Decentralized:
PlatypusFinance
Orca: `go mod tidy -go=1.16 && go mod tidy -go=1.17 && go install && SOLANA_URI_REST=https://try-rpc.mainnet.solana.blockdaemon.tech/ collector -exchange=Orca -mode=current -pairsfile=true`