I have another question from Atrickplays' study guide that doesn't make sense to me:
If a driver completes a trip of 120 miles at 30 mph, at what speed would the return trip need to be driven for the average speed of the whole trip to be 40 mph?
--The answer given uses the formula d = r*t and comes to 60 mph.
Maybe I am underthinking this, but if the return trip is assumed to also be 120 miles, wouldn't 50 mph on the return equal an average of 40 mph?
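
For reference, here is a minimal sketch of how the guide's d = r*t approach presumably arrives at 60 mph, assuming the return leg is also 120 miles (the variable names are my own):

```python
# d = r*t reasoning: average speed is total distance / total time,
# not the mean of the two leg speeds.

d_out = 120          # outbound distance, miles
r_out = 30           # outbound speed, mph
target_avg = 40      # desired average speed for the round trip, mph

total_distance = 2 * d_out                  # 240 miles round trip
total_time = total_distance / target_avg    # 240 / 40 = 6 hours allowed
t_out = d_out / r_out                       # 120 / 30 = 4 hours already used
t_back = total_time - t_out                 # 2 hours left for the return leg
r_back = d_out / t_back                     # 120 / 2 = 60 mph

print(r_back)  # 60.0

# Checking the 50 mph intuition: the return leg would take
# 120 / 50 = 2.4 hours, so the trip averages 240 / 6.4 = 37.5 mph.
print(total_distance / (t_out + d_out / 50))  # 37.5
```

The slower outbound leg eats up more of the clock, so the two speeds can't simply be averaged; that is why 50 mph falls short of 40 mph overall.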