
Commit 5c44990

readme: add installation, prerequisites & running sections 📝

1 parent 7fdb4d8 commit 5c44990

1 file changed: readme.md (+64 −2 lines)
@@ -122,14 +122,55 @@ This whole process, which we call *matching*, is done continuously for each VDV-
## Installation

There is [a Docker image available](https://github.com/OpenDataVBB/pkgs/container/gtfs-rt-feed):

```shell
# Pull the Docker images …
docker pull ghcr.io/opendatavbb/gtfs-rt-feed
docker pull ghcr.io/mobidata-bw/postgis-gtfs-importer:v4 # needed for importing GTFS Schedule data

# … or install everything manually (you will need Node.js & npm).
git clone https://github.com/OpenDataVBB/gtfs-rt-feed.git gtfs-rt-feed
cd gtfs-rt-feed
npm install --omit=dev
# initialise the submodules & install their dependencies
git submodule update --init --checkout
cd postgis-gtfs-importer && npm install --omit=dev
```
## Getting Started

> [!IMPORTANT]
> Although `gtfs-rt-feed` is intended to be data-source-agnostic, just following the GTFS Schedule and GTFS-RT specs, it currently has some hard-coded assumptions specific to the [VBB deployment](https://github.com/OpenDataVBB/gtfs-rt-infrastructure) it has been developed for. Please create an Issue if you want to use `gtfs-rt-feed` in another setting.

### Prerequisites
`gtfs-rt-feed` needs access to the following services to work:

- a [NATS message queue](https://docs.nats.io) with [JetStream](https://docs.nats.io/nats-concepts/jetstream) enabled
- a [PostgreSQL database server](https://postgresql.org), with the permission to dynamically create new databases (see [postgis-gtfs-importer](https://github.com/mobidata-bw/postgis-gtfs-importer)'s readme)
- a [Redis in-memory cache](https://redis.io/docs/latest/)
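
For local development, one way to run these services is via Docker – a sketch, not the project's documented setup (the image tags are examples):

```shell
# NATS with JetStream enabled (the `-js` flag is passed through to nats-server)
docker run -d --name nats -p 4222:4222 nats:2 -js

# PostgreSQL (with PostGIS, as used by postgis-gtfs-importer)
docker run -d --name postgis -p 5432:5432 -e POSTGRES_PASSWORD=password postgis/postgis:16-3.4

# Redis
docker run -d --name redis -p 6379:6379 redis:7
```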
#### configure access to PostgreSQL

`gtfs-rt-feed` uses [`pg`](https://npmjs.com/package/pg) to connect to PostgreSQL. For details about supported environment variables and their defaults, refer to [`pg`'s docs](https://node-postgres.com).

To make sure that the connection works, use [`psql`](https://www.postgresql.org/docs/14/app-psql.html) from the same context (same permissions, same container if applicable, etc.).
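
`pg` honours the usual libpq-style `PG*` environment variables, so a local setup might look like this (placeholder values, not project defaults):

```shell
# hypothetical example values – adjust to your environment
export PGHOST=localhost
export PGPORT=5432
export PGUSER=postgres
export PGPASSWORD=password
export PGDATABASE=postgres

# quick connectivity check from the same shell
psql -c 'SELECT 1'
```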
#### configure access to NATS

`gtfs-rt-feed` uses [`nats`](https://npmjs.com/package/nats) to connect to NATS. You can use the following environment variables to configure access:

- `$NATS_SERVERS` – list of NATS servers (e.g. `localhost:4222`), separated by `,`
- `$NATS_USER` & `$NATS_PASSWORD` – if you need [authentication](https://docs.nats.io/using-nats/developer/connecting/userpass)
- `$NATS_CLIENT_NAME` – the [connection name](https://docs.nats.io/using-nats/developer/connecting/name)

By default, `gtfs-rt-feed` will connect as `gtfs-rt-$MAJOR_VERSION` to `localhost:4222` without authentication.
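
For example, with authentication enabled (placeholder values):

```shell
# hypothetical example values – adjust to your NATS deployment
export NATS_SERVERS='nats-1.example.org:4222,nats-2.example.org:4222'
export NATS_USER='gtfs-rt'
export NATS_PASSWORD='secret'
export NATS_CLIENT_NAME='gtfs-rt-feed-dev'
```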
#### configure access to Redis

`gtfs-rt-feed` uses [`ioredis`](https://npmjs.com/package/ioredis) to connect to Redis. For details about supported environment variables and their defaults, refer to [its docs](https://github.com/redis/ioredis#readme).
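
To verify that a Redis instance is reachable from the same context, you can use `redis-cli` (a generic connectivity check, unrelated to `gtfs-rt-feed` itself):

```shell
# assumes Redis is running locally on the default port
redis-cli -h localhost -p 6379 ping # should print "PONG"
```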
### import GTFS Schedule data
Make sure your GTFS Schedule dataset is available via HTTP without authentication. Configure the URL using `$GTFS_DOWNLOAD_URL`. Optionally, you can configure the `User-Agent` being used for downloading by setting `$GTFS_DOWNLOAD_USER_AGENT`.
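
For example (the URL and `User-Agent` below are placeholders):

```shell
# hypothetical example values – adjust to your GTFS Schedule feed
export GTFS_DOWNLOAD_URL='https://example.org/gtfs.zip'
export GTFS_DOWNLOAD_USER_AGENT='my-gtfs-rt-setup'
```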
@@ -168,7 +209,28 @@ export PGDATABASE="$(psql -q --csv -t -c 'SELECT db_name FROM latest_import')"
### run `gtfs-rt-feed`

```shell
# Run using Docker …
# (In production, use the container deployment tool of your choice.)
# Note: pass through other environment variables using additional `-e` flags.
docker run --rm -it \
	-e PGDATABASE \
	ghcr.io/opendatavbb/gtfs-rt-feed

# … or manually.
# (During development, pipe the logs through `./node_modules/.bin/pino-pretty`.)
node index.js
```
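
During development, the manual invocation can be combined with `pino-pretty` like this (assuming it is available in `./node_modules/.bin`, as mentioned above):

```shell
# pretty-print the structured log output during development
node index.js | ./node_modules/.bin/pino-pretty
```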
todo: document the following environment variables:

- `$LOG_LEVEL`
- `$LOG_LEVEL_MATCHING`
- `$LOG_LEVEL_FORMATTING`
- `$LOG_LEVEL_STATION_WEIGHT`
- `$METRICS_SERVER_PORT`
- `$MATCHING_CONCURRENCY`
- `$MATCH_GTFS_RT_TO_GTFS_CACHING`
- `$MATCHING_CONSUMER_DURABLE_NAME`
- `$PG_POOL_SIZE`

### Alternative: Docker Compose setup