Multiplying Decimals
Suppose you're multiplying a decimal by a whole number, say 0.25 × 3.
This is the same as adding the decimal three times: 0.25 + 0.25 + 0.25 = 0.75. You can think of it as follows: if three friends each have 25 cents, together they have a total of 75 cents.
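The repeated-addition idea can be checked directly. Here's a minimal Python sketch using the standard-library `decimal` module, which keeps exact decimal digits instead of binary floating point (0.25 and 3 are assumed example values):

```python
from decimal import Decimal

# Multiplying a decimal by a whole number is repeated addition.
# Decimal("0.25") keeps the value exact; 0.25 and 3 are assumed
# example numbers, not fixed by the algorithm.
price = Decimal("0.25")
total = price + price + price   # add the decimal three times
assert total == price * 3       # repeated addition equals multiplication
print(total)                    # 0.75
```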
It's a bit trickier when both numbers are decimals. Take the problem 0.25 × 0.9. The number 0.9 is less than 1, so what does it mean to add up the first decimal 0.9 times?
Remember that decimals are just another way of writing fractions that have powers of 10 in the denominator. Multiplying a number by 0.9 is the same as finding nine-tenths of that number. So you could rewrite the problem as
0.25 × 0.9 = 25/100 × 9/10.
Then you would multiply numerators and denominators to get 225/1000. This fraction is the same as the decimal 0.225.
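The fraction rewrite can be sketched with Python's standard-library `fractions` module (25/100 and 9/10 are assumed example values):

```python
from fractions import Fraction

# Decimals are fractions with power-of-ten denominators.
# 25/100 and 9/10 are assumed example values.
a = Fraction(25, 100)   # 0.25
b = Fraction(9, 10)     # 0.9
product = a * b         # 225/1000, which Fraction reduces to 9/40
print(float(product))   # 0.225
```

Note that `Fraction` automatically reduces 225/1000 to lowest terms (9/40); converting to `float` recovers the decimal form.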
Of course, you don't have to convert to fraction notation every time.
Standard Algorithm for Multiplying Decimals
First, just multiply the numbers as if they were whole numbers. (Don't line up the decimal points!)
Then count the total number of places to the right of the decimal point in BOTH numbers you're multiplying. Let's call this number n. In your answer, start from the right, move n places to the left, and put a decimal point there.
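The two steps above can be sketched as a short Python function. This is a minimal implementation assuming non-negative inputs given as strings (the name `multiply_decimals` is hypothetical):

```python
def multiply_decimals(a: str, b: str) -> str:
    """Standard algorithm: multiply as whole numbers, then place the point.
    Assumes non-negative decimal strings like "3.1" or "25"."""
    # Count digits to the right of each decimal point, and strip the points.
    places = 0
    digits = []
    for s in (a, b):
        if "." in s:
            whole, frac = s.split(".")
            places += len(frac)
            digits.append(whole + frac)
        else:
            digits.append(s)
    # Step 1: multiply as if both were whole numbers.
    raw = str(int(digits[0]) * int(digits[1]))
    if places == 0:
        return raw
    # Step 2: from the right, move `places` positions left and insert the point.
    raw = raw.zfill(places + 1)          # pad with zeros so the point fits
    return raw[:-places] + "." + raw[-places:]

print(multiply_decimals("3.1", "2.45"))  # 7.595
```

Working on strings keeps every digit exact, which mirrors the by-hand algorithm more closely than floating-point arithmetic would.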
Example:
Multiply 3.1 × 2.45.
Step 1: Multiply the numbers, ignoring the decimal points: 31 × 245 = 7595.
Step 2: In 3.1, there is 1 place to the right of the decimal point. In 2.45, there are 2. So, since 1 + 2 = 3, move in 3 decimal places from the right in your answer: 7.595.
You can check that this is reasonable. 3.1 is close to 3, and 2.45 is close to 2.5, so we expect an answer close to 7.5. And we got one!
Why does this work? Again, what you're really doing is multiplying fractions. 3.1 means 31/10, and 2.45 means 245/100. When we multiply these fractions, we get 10 × 100 = 1000 in the denominator, so the final answer is expressed in thousandths: 7595/1000 = 7.595. When you add the total number of places to the right of the decimal points in the factors, what you're really doing is multiplying the powers of ten in the denominators of the fractions.
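The correspondence between counting decimal places and multiplying denominators can be verified with a short sketch (3.1 and 2.45 are assumed example values):

```python
from fractions import Fraction

# Counting decimal places in the factors is the same as multiplying
# the power-of-ten denominators. 3.1 and 2.45 are assumed examples.
a, b = "3.1", "2.45"
places = len(a.split(".")[1]) + len(b.split(".")[1])  # 1 + 2 = 3
denominator = 10 ** places                            # 10 * 100 = 1000
product = Fraction(31, 10) * Fraction(245, 100)
# The unreduced product is 7595/1000: the answer in thousandths.
assert product == Fraction(7595, denominator)
print(places, denominator)  # 3 1000
```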