Optimisation problems help organisations manage the resources, time, and costs of their processes. The growing volume of consumable data, together with the emergence of more sophisticated use cases, leads to a high number of variables and input data. Consequently, modelling and solving huge quantities of Constraint Optimisation Problems (hereinafter, COPs), given their complexity, requires extra effort. In this paper, we aim to help stakeholders model their COPs, integrate different data sources with the COP variables and input data, and solve the COPs in a distributed environment by means of Big Data techniques.

To face these challenges, we developed FABIOLA (Fast Big Constraint Laboratory), which solves COPs over large datasets in a systematic way. It relies on several modern Big Data technologies, and it provides a user-friendly interface that facilitates modelling COPs, executing them, and analysing the results.

The approach is applied to an industrial scenario in which several electricity wholesale companies employ constraint optimisation techniques to optimise the tariffs that their customers may contract. By means of asymptotic analysis, we evaluate the performance of our proposal, determining the degree to which distributing the COPs improves the execution time with respect to sequential execution as the complexity of the dataset increases. Promising results are obtained.

FABIOLA isolates the resolution of COPs from the location of the data. Our systematic framework facilitates the integration of different data sources, the selection of COP inputs, the definition of optimisation models, their execution, and the querying of the results.