What is the average speed, in miles per hour, of an object that takes 0.5 seconds to travel 60.5 feet? (Note: 1 mile [tex]$= 5,280$[/tex] feet)

A. 30.25
B. 61.00
C. 82.50
D. 121.00



Answer:

To determine the average speed of an object that takes 0.5 seconds to travel 60.5 feet, given that 1 mile equals 5280 feet, we can follow these steps:

1. Convert the distance from feet to miles:
- We know that 1 mile = 5280 feet.
- Hence, the distance in miles is calculated by dividing the distance in feet by the number of feet per mile.

[tex]\[ \text{Distance in miles} = \frac{60.5 \text{ feet}}{5280 \text{ feet per mile}} \approx 0.0114583 \text{ miles} \][/tex]

2. Convert the time from seconds to hours:
- We know that 1 hour = 3600 seconds.
- Thus, the time in hours is determined by dividing the time in seconds by the number of seconds per hour.

[tex]\[ \text{Time in hours} = \frac{0.5 \text{ seconds}}{3600 \text{ seconds per hour}} \approx 0.000138889 \text{ hours} \][/tex]

3. Calculate the average speed:
- Speed is defined as distance divided by time.
- Therefore, we calculate the average speed in miles per hour by dividing the distance in miles by the time in hours.

[tex]\[ \text{Average speed} = \frac{\text{Distance in miles}}{\text{Time in hours}} = \frac{0.0114583 \text{ miles}}{0.000138889 \text{ hours}} = 82.5 \text{ miles per hour} \][/tex]

Thus, after performing these calculations, we find that the average speed of the object is:

[tex]\[ \boxed{82.5 \text{ miles per hour}} \][/tex]

This corresponds to answer choice C (82.50).
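If you want to double-check the arithmetic, here is a minimal Python sketch of the same three steps; the variable names are illustrative and not part of the original problem:

```python
# Given values from the problem
feet_per_mile = 5280        # 1 mile = 5,280 feet
seconds_per_hour = 3600     # 1 hour = 3,600 seconds

distance_feet = 60.5
time_seconds = 0.5

# Step 1: convert feet to miles
distance_miles = distance_feet / feet_per_mile      # ≈ 0.0114583 miles

# Step 2: convert seconds to hours
time_hours = time_seconds / seconds_per_hour        # ≈ 0.000138889 hours

# Step 3: speed = distance / time
average_speed_mph = distance_miles / time_hours
print(average_speed_mph)                            # 82.5
```

Running this prints 82.5, matching choice C.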