
LLM-FORECASTER

The aim of the project is to develop an innovative LLM-Forecaster system for automatic time series forecasting, based on Large Language Models (LLMs) and the Retrieval-Augmented Generation (RAG) technique.

The project consists of three tasks:

Task 1 – Industrial Research
This stage focuses on developing a module for automated data collection, processing, and selection from multiple real-time sources. The output will be optimized datasets for time series forecasting.
Outcome: improved prediction accuracy of LLM-based models and development of effective anomaly detection methods.
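One of the anomaly detection approaches commonly applied to incoming time series data is a rolling z-score test; the sketch below is a minimal, hypothetical illustration of the idea, not the project's actual method, and the function name and parameters are illustrative only.

```python
from statistics import mean, stdev

def rolling_zscore_anomalies(series, window=5, threshold=3.0):
    """Flag indices whose z-score against a trailing window exceeds the threshold."""
    anomalies = []
    for i in range(window, len(series)):
        ref = series[i - window:i]          # trailing reference window
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A flat series with a single spike: only the spike (index 6) is flagged.
data = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 50.0, 10.0, 9.8]
print(rolling_zscore_anomalies(data))  # → [6]
```

In a data collection pipeline, flagged points would typically be dropped or imputed before the series is handed to the forecasting model.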

Task 2 – Industrial Research
This stage involves designing and evaluating innovative forecasting methods using LLM architectures.
Outcome: creation of a unique LLM-Forecaster algorithm that increases prediction accuracy across key business sectors such as finance, transport, sales, and logistics.
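A common way to apply an LLM to time series forecasting is to serialize numeric history into text, prompt the model to continue it, and parse numbers back out of the completion. The sketch below illustrates that general pattern under stated assumptions: the prompt template, function names, and the stubbed completion are all hypothetical and do not reflect the project's actual algorithm.

```python
def serialize_series(values, decimals=1):
    """Encode a numeric series as comma-separated text for an LLM prompt."""
    return ", ".join(f"{v:.{decimals}f}" for v in values)

def build_forecast_prompt(history, horizon):
    """Hypothetical prompt template; the real prompting scheme is not public."""
    return (
        f"Continue this time series with {horizon} further values, "
        f"comma-separated, numbers only:\n{serialize_series(history)}\n"
    )

def parse_forecast(completion, horizon):
    """Extract the first `horizon` numbers from the model's text completion."""
    tokens = completion.replace("\n", " ").split(",")
    return [float(tok) for tok in tokens if tok.strip()][:horizon]

history = [112.0, 118.0, 132.0, 129.0, 121.0, 135.0]
prompt = build_forecast_prompt(history, horizon=3)
# In production the prompt would be sent to an LLM; a stubbed completion stands in here.
stub_completion = "148.0, 148.0, 136.0"
print(parse_forecast(stub_completion, horizon=3))  # → [148.0, 148.0, 136.0]
```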

Task 3 – Experimental Development
This stage integrates results from previous phases into a complete time series forecasting module, including user and API interfaces.
Outcome: a fully functional and validated LLM-Forecaster system, tested with an external client, capable of large-scale data processing and user-friendly result presentation.
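An API interface for such a module typically exchanges JSON: a request carrying the series and forecast horizon, and a response carrying the predicted values. The sketch below shows one plausible request/response shape; the field names are illustrative, and a naive last-value (persistence) forecast stands in for the actual model.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ForecastRequest:
    """Hypothetical API request shape; field names are illustrative only."""
    series: list
    horizon: int

@dataclass
class ForecastResponse:
    forecast: list
    model: str

def handle_request(raw_json):
    """Decode a request, produce a placeholder forecast, and encode the response."""
    req = ForecastRequest(**json.loads(raw_json))
    # A real deployment would call the forecasting model here; a persistence
    # forecast (repeat the last observed value) stands in as a placeholder.
    forecast = [req.series[-1]] * req.horizon
    return json.dumps(asdict(ForecastResponse(forecast=forecast, model="placeholder")))

print(handle_request('{"series": [1.0, 2.0, 3.0], "horizon": 2}'))
```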

Target groups:

- companies operating in the financial, trade, logistics, and transport industries,

- enterprises leveraging data analysis and AI solutions,

- B2B companies looking for prediction and process-optimisation tools.

Aim of the project:
Development of a unique technical solution enabling automatic, precise, and fast prediction of events based on large data sets.

Project’s results:

- development of new methods and algorithms based on LLM and RAG,

- creation of a new, deployment-ready LLM-Forecaster system,

- implementation of the solution in the applicant’s business operations within 6 months of the project’s end.

Total project costs: PLN 9 688 091,11

Grant value: PLN 4 352 916,87

#FunduszeUE #FunduszeEuropejskie #EuropeanFunds #UEFunds
