In computer architecture, Gustafson's law (or Gustafson–Barsis's law) gives the speedup in the execution time of a task that theoretically gains from parallel computing, using a hypothetical run of the task on a single-core machine as the baseline. To put it another way, it is the theoretical "slowdown" of an already parallelized task when run on a serial machine. It is named after computer scientist John L. Gustafson and his colleague Edwin H. Barsis, and was presented in the 1988 article "Reevaluating Amdahl's Law".
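In its usual formulation, if s and p = 1 − s denote the fractions of execution time spent on the serial and parallelizable parts of a program, as measured on a parallel system with N processors, the scaled speedup is

\[
S(N) = s + pN = N + (1 - N)\,s .
\]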
In contrast to Amdahl's law, which assumes a fixed problem size and therefore bounds the achievable speedup by the serial fraction of the workload, Gustafson's law assumes that problem sizes grow with the available computational resources: the parallel portion of the work scales with the number of processors while the serial portion remains roughly constant, allowing much greater effective speedup from parallel execution.
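As a concrete illustration (the values s = 0.1 and N = 100 are chosen arbitrarily, and the two laws measure the serial fraction against different baselines, so the comparison is indicative rather than exact):

\[
S_{\text{Amdahl}} = \frac{1}{s + (1 - s)/N} = \frac{1}{0.1 + 0.9/100} \approx 9.2,
\qquad
S_{\text{Gustafson}} = N + (1 - N)\,s = 100 + (1 - 100)(0.1) = 90.1 .
\]

Under Amdahl's fixed-workload assumption the speedup saturates near 1/s = 10 no matter how many processors are added, whereas under Gustafson's scaled-workload assumption it continues to grow nearly linearly with N.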