Answer:

A. To scale the output of each layer

Explanation:

Batch normalization normalizes the output of the previous activation layer by subtracting the batch mean and dividing by the batch standard deviation. It then applies two learnable parameters, a scale (gamma) and a shift (beta), so the network can rescale the normalized output of each layer as needed.
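
Below is a minimal NumPy sketch of the batch normalization forward pass described above. The function name batch_norm, the epsilon value, and the example shapes are illustrative assumptions, not part of the original answer.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (batch_size, num_features) activations from the previous layer
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize to zero mean, unit variance
    return gamma * x_hat + beta             # learnable scale (gamma) and shift (beta)

# Illustrative usage (hypothetical shapes)
x = np.random.randn(32, 4)     # batch of 32 samples with 4 features each
gamma = np.ones(4)             # scale, typically initialized to 1
beta = np.zeros(4)             # shift, typically initialized to 0
y = batch_norm(x, gamma, beta)
print(y.mean(axis=0))          # approximately 0 per feature
print(y.std(axis=0))           # approximately 1 per feature
```

With gamma initialized to 1 and beta to 0, the output is simply the normalized activations; during training, these parameters are learned, which is what allows the layer to scale its output.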