The volume of data that calls for data engineering varies with the specific use case and the type of data being processed. In general, data engineering is most useful when dealing with volumes of data that are difficult to manage and process with traditional data management and processing techniques.
The exact threshold for what constitutes a "large volume" of data depends on factors such as the type and complexity of the data and the infrastructure available. A dataset considered "large" by one organization may be small for another.
However, as a rough guideline, data engineering is often applied to datasets measured in terabytes (TB) or petabytes (PB), as handled by organizations that perform large-scale data processing, such as social media platforms and financial institutions.
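As a minimal sketch of this kind of rule of thumb, the snippet below converts a byte count to terabytes and flags datasets above a cutoff as candidates for distributed data-engineering tooling. The 1 TB default cutoff and the function names are illustrative assumptions for this example, not an industry standard:

```python
def bytes_to_terabytes(num_bytes: int) -> float:
    """Convert a byte count to terabytes (1 TB = 10**12 bytes)."""
    return num_bytes / 10**12

def needs_distributed_processing(dataset_bytes: int, threshold_tb: float = 1.0) -> bool:
    """Rough heuristic: flag datasets above `threshold_tb` terabytes as
    candidates for distributed tooling. The 1 TB default is an
    illustrative assumption; the right cutoff depends on the
    organization's infrastructure and workload."""
    return bytes_to_terabytes(dataset_bytes) >= threshold_tb

# A 50 GB dataset (0.05 TB) typically fits on a single machine,
# while a 5 TB dataset is a candidate for distributed processing.
print(needs_distributed_processing(50 * 10**9))   # prints False
print(needs_distributed_processing(5 * 10**12))   # prints True
```

In practice the decision also weighs data complexity and growth rate, not just raw size, which is why the threshold here is a parameter rather than a constant.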