Parallel programming came into wider use in the early 1990s, when the need for larger and faster solutions in the physical sciences and engineering exceeded what individual computers could deliver. Scientists and engineers began connecting computers in intricate, dense networks and exchanging packets of data among them. What had been one impossibly large computational problem became feasible because each computer could work on a small subset of the problem and communicate only the data that other computers needed to proceed with their calculations. This gave rise to the message-passing style of parallel programming, later standardized as the Message Passing Interface (MPI). To this day, MPI remains the de facto standard for parallel programming in Fortran, C, and C++ applications.
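To make the message-passing idea concrete, here is a minimal sketch in C using the standard MPI API: one process sends a piece of data, and another receives it and continues its own work. The two-process layout and the value being sent are illustrative, not from the original text; any MPI implementation (for example MPICH or Open MPI) provides these calls.

```c
/* Minimal message-passing sketch: rank 0 sends an integer to rank 1.
 * Compile with mpicc and run with: mpirun -n 2 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* which process am I? */

    if (rank == 0) {
        int data = 42;  /* stands in for a piece of the larger problem */
        MPI_Send(&data, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        int data;
        MPI_Recv(&data, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("Rank 1 received %d from rank 0\n", data);
        /* ...continue local computation with the received data... */
    }

    MPI_Finalize();
    return 0;
}
```

Each process runs the same program but takes a different branch based on its rank, exchanging only the data the other side needs, which is the essence of the message-passing style described above.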