Includes a new "app" that runs the ingestion jobs (fetch_news,
find_missing_companies, and enrich_company_financials) on a schedule.
This also fixes an issue with the previous schedule implementation by
persisting the schedule state to a file, so it survives new deployments
and continues where it left off.
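A minimal sketch of how such a persistent schedule can look, assuming
APScheduler with a SQLite-backed job store; the intervals, job-store
path, and the `ingestion.jobs` module path are illustrative, not
necessarily what this PR uses:

```python
from apscheduler.schedulers.blocking import BlockingScheduler
from apscheduler.jobstores.sqlalchemy import SQLAlchemyJobStore

# Hypothetical module path; the real jobs live wherever this repo defines them.
from ingestion.jobs import (
    fetch_news,
    find_missing_companies,
    enrich_company_financials,
)

scheduler = BlockingScheduler(
    # The SQLite file keeps pending jobs across container restarts/deploys.
    jobstores={"default": SQLAlchemyJobStore(url="sqlite:///data/schedule.sqlite")}
)

# Stable ids plus replace_existing avoid duplicating jobs on every deploy.
scheduler.add_job(fetch_news, "interval", hours=1,
                  id="fetch_news", replace_existing=True)
scheduler.add_job(find_missing_companies, "interval", hours=6,
                  id="find_missing_companies", replace_existing=True)
scheduler.add_job(enrich_company_financials, "interval", hours=6,
                  id="enrich_company_financials", replace_existing=True)

scheduler.start()
```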
Created a data-processing pipeline that enriches the raw mined data
with organization extraction and sentiment analysis prior to moving the
data into the SQL database.
The matched data is transferred afterwards.
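As a rough illustration of the enrichment step, assuming spaCy for the
organization extraction and VADER for sentiment; the pipeline's actual
models and record layout are not specified here:

```python
import spacy
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

# Assumes the small English model is installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
analyzer = SentimentIntensityAnalyzer()

def enrich(article: dict) -> dict:
    """Attach organization mentions and a sentiment score to a raw article."""
    doc = nlp(article["text"])
    # Deduplicated organization entities found by the NER component.
    article["organizations"] = sorted(
        {ent.text for ent in doc.ents if ent.label_ == "ORG"}
    )
    # Compound score in [-1, 1]: negative to positive sentiment.
    article["sentiment"] = analyzer.polarity_scores(article["text"])["compound"]
    return article
```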
---------
Co-authored-by: SeZett <zeleny.sebastian@fh-swf.de>
- [x] add a CLI to the web server that takes environment variables into
account (see the sketch after this list)
- [x] add a CLI to the data processing that accepts environment
variables as a valid configuration source
- [x] rework the CLI for the reset SQL command
- [x] rework the CLI for copying SQL data from one database to another
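A minimal sketch of the env-var-aware CLI pattern, assuming Click; the
option names and environment variables here are illustrative, not this
PR's actual flags:

```python
import click

@click.command()
@click.option("--database-url", envvar="DATABASE_URL", required=True,
              help="Falls back to the DATABASE_URL environment variable.")
@click.option("--port", envvar="PORT", type=int, default=8000,
              show_default=True,
              help="Falls back to the PORT environment variable.")
def serve(database_url: str, port: int) -> None:
    """Start the web server, reading settings from flags or the environment."""
    click.echo(f"Serving on port {port} against {database_url}")

if __name__ == "__main__":
    serve()
```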
- added a Dockerfile for the three containers
- added a workflow step that builds the containers and pushes them to
the registry
- added a docker-compose.yaml that uses the prebuilt images (a minimal
sketch follows this list)
- added a Docker Compose file to build the images locally and a script
for the prebuild steps
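For orientation, a sketch of what the registry-image compose file might
look like; the service names, image paths, ports, and volumes are
placeholders rather than this PR's actual configuration:

```yaml
services:
  webserver:
    image: ghcr.io/example/webserver:latest  # placeholder registry path
    env_file: .env                           # CLIs read settings from the environment
    ports:
      - "8000:8000"
  dataprocessing:
    image: ghcr.io/example/dataprocessing:latest
    env_file: .env
  scheduler:
    image: ghcr.io/example/scheduler:latest  # runs the ingestion jobs on a schedule
    env_file: .env
    volumes:
      - schedule-data:/data                  # keeps the schedule file across deploys
volumes:
  schedule-data:
```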