rabidsamfan: samwise gamgee, I must see it through (Default)
[personal profile] rabidsamfan
(or what happens when I'm having a week I don't want to think about:)

I can pretty much guarantee that whenever I get a young teenager in the library who is desperately trying to make sense of pre-algebra or algebra, a little discussion will show me that the child doesn't understand dividing fractions. (Given, of course, that the kid isn't still trying to count on fingers, but that's a rant for another day.)


Now, I believe that there are people who are "number blind", in the same way that people can be color blind. And for those folks, a thousand different explanations will only make a slight dent. They might understand for a while, but the comprehension won't stick for long. But for a lot of folks – kids and adults – the trouble is that the explanation which would make sense of dividing fractions hasn't been given to them yet. And they need an explanation, because dividing fractions is, as the present jargon puts it, "non-intuitive."




We write fractions by taking two numbers, putting one above the other, and drawing a little line horizontally between them. We read them by saying things like "two over three" or "two thirds", always mentioning the top number before the bottom number.


There are two words you have to know in order to talk, write, or read about fractions. The number on the top of a fraction is the numerator; the number on the bottom is the denominator. (The little line has a name too, but I don't know it, I just draw the darn thing.) When you try to write a fraction with a keyboard, you end up with something that looks like it got tipped over like this: 2/3, where the numerator is the number on the left and the denominator is the number on the right.


It's all your teacher's fault.


When a teacher wants to explain fractions to the first graders she gets out a pie, cuts it into 3 pieces, takes one away and points to the remaining two and says, "Look! It's two thirds!" Your teacher did something similar, admit it, and you came away with the idea that fractions have to do with pies and pizzas and how to get more than your fair share of lunch. If you're lucky, you also came away with the idea that fractions are about dividing.


(I saw some of you freeze in the back of the room. It's okay. You can blink. We're not dividing fractions just yet.)


The next day, out come the manipulatives. Twelve blocks! And now, the teacher makes three groups of four blocks each, takes away one group leaving 8 blocks, and says, "Look! It's 2/3rds!", convincing half the class that teachers are even weirder than they seem and leaving the other half wondering how come this fraction isn't round or edible. After many, many examples, and lots of playing with the blocks, most kids come around to the idea that fractions are about dividing something into pieces and then counting some of the pieces.

Once you've got the idea that a fraction is a part of a whole more or less fixed in your head, you go to the next step. Doing arithmetic with fractions. And if you're like most people you start out by doing what the teacher tells you to do and hoping it will make sense the way the pizza eventually did.


Adding and subtracting fractions is a pain in the patoot, because you've got to do that lowest common denominator thing (and the teacher keeps talking about apples and oranges and it still isn't lunchtime and now the teacher wants me to make fractions out of oranges and apples as well as pizza) but eventually most people figure out, with the help of blocks and money and pies, that since you can't get the right answer unless you do it the teacher's way you might as well do it.


Multiplying fractions is strange, but once you figure out that "of" means multiply and you play with the pizza and the pie and the money until you see that a half of a half is a quarter (and it's the money that really makes it clear because you have to share that half dollar with your sister) you can get by. Multiply the numerator of the first fraction by the numerator of the second fraction and the denominator of the first fraction by the denominator of the second fraction and you get an answer to your paper problem that you can represent with your manipulatives or money or pizza or pie pretty easily.
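If you'd rather poke at this with a keyboard than with pizza, here's a little sketch (my numbers, not part of the lesson) using Python's standard `fractions` module, which does the numerator-times-numerator, denominator-times-denominator work for you:

```python
from fractions import Fraction

# A half OF a half: "of" means multiply.
half = Fraction(1, 2)
print(half * half)    # 1/4 -- the quarter you get sharing a half dollar

# The rule in general: multiply tops together, multiply bottoms together.
print(Fraction(2, 3) * Fraction(3, 4))    # 6/12, which reduces to 1/2
```

Note that `Fraction` automatically reduces the answer to lowest terms, the way your teacher made you do by hand.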


But when it comes to dividing fractions, suddenly pizzas and pies and blocks and money don't help. When you divide with a fraction, the teacher says that you should turn it upside down and multiply instead, and that just doesn't translate into pizza or blocks or money. Dividing means cutting things up! Putting things in separate piles! Taking a whole bunch of things and splitting them into equal groups!


Doesn't it?


Er... sort of.


But you can stand over a pizza with a knife for a really long time trying to decide how to make a cut that's going to represent what happens when you divide one fraction by another fraction. If you're used to coming to understand an abstract concept in math by playing with concrete examples, dividing fractions is the first place you're likely to find yourself truly stuck.


So, let's go back a step.


A fraction isn't just a piece of something. In math, a fraction is a division problem, written in shorthand. The numerator is being divided by the denominator. A fraction is also a ratio, odds, a percentage, a decimal number and a proportion. And because it can be all of those things, it's one of the most incredibly useful tools in math. Unfortunately, because it's a division problem, it bangs up against one of the other most incredibly useful tools in math.
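You can watch the shorthand turn back into a division problem with one line of Python (a sketch of mine, not the post's):

```python
from fractions import Fraction

print(Fraction(2, 3))           # the division problem, left in shorthand: 2/3
print(float(Fraction(2, 3)))    # actually doing the division gives the decimal, 0.666...
print(2 / 3)                    # same thing, written as bare division
```

Converting a fraction to a decimal is nothing more than finally carrying out the division the fraction was abbreviating.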


Nothing.


Yup. That's where the trouble is. Nothing – or, as we usually say: Zero.


Remember your algebra teacher saying "You can't divide by zero!" as they slashed through your careful calculations with a red pencil? Betcha they never told you why.


You see, math is a game, like Monopoly. And you never divide by zero for the same reason you can't collect 200 dollars if you don't pass Go. It's against the rules.*


WAIT! Whaddaya mean math is a game? What about my pizzas? What about my manipulatives? What about my MONEY!? Everyone always told me I should care about math because I'm going to use it in the real world every day!


Everyone lied.


No, no, they didn't lie, they just didn't tell the whole truth. Math corresponds to the real world most of the time, but only most of the time. Try packing 68 children into 3.4 school buses with a capacity of 20 children each and you'll understand why you should have asked for four buses instead of taking the number straight off your calculator.
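The bus arithmetic, straight off the calculator and then fixed up for the real world (my sketch):

```python
import math

children = 68
capacity = 20

print(children / capacity)             # 3.4 -- what the calculator says
print(math.ceil(children / capacity))  # 4 -- buses only come in whole numbers
```

Rounding up (the "ceiling") is the real-world patch: 3.4 buses is a perfectly good number and a perfectly useless vehicle.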


The people who invented numbers and started the game began with easy stuff like addition and subtraction, where you could always, if you had enough blocks, represent the numbers with objects and count to see if you got the arithmetic right. It became obvious fairly quickly that you could also check your arithmetic by undoing what you'd just done. If two plus three equals five then five take away three equals two.


Being able to undo things was so glorious that it became a hard and fast rule of the number game. You can only do something, the rules said, if you can undo it too. And with addition and subtraction it works! Even with nothing! If you add nothing to five you get five and if you take nothing away from five you get five again.


And then some clown invented multiplication. Adding in groups! Faster, easier! Take five groups of four and multiply and whammo, you've got twenty and you haven't even had to take your shoes off! And you could undo it! Take twenty and divide it by five and you've got four! Useful! You could share out sheep between your sons fairly without going through the long process of taking one sheep to each son in turn until you ran out of lambs. The dancing could commence!


And then it happened. Some spoilsport pointed out that if you multiplied five by nothing you got nothing. And there was no way to undo it and get back to five. The incredible usefulness of being able to undo things slammed right up against the incredibly useful idea of nothing (zero hadn't been invented yet) and nothing won. We needed to be able to multiply by nothing. We had to have a way to say that there were no groups at all.


And at the same time, we had to have a way to undo multiplication all the rest of the time. Dividing made instant sense of sharing things fairly as long as you didn't mess with that pesky nothing.


So the number game people made an exception to the rules. You're allowed to multiply by nothing. But when you do, you're stuck. You can't do it the other way around.


And as a result, of the four major operations in arithmetic, Addition, Subtraction, Multiplication and Division, it's Division that has the short end of the stick when it comes to nothing and that causes all the trouble. You can't divide by zero. The game won't let you.
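Python enforces the same rule of the game. Multiplying by nothing is fine; trying to undo it gets you thrown out (my sketch):

```python
from fractions import Fraction

# Multiplying by nothing is allowed...
print(5 * 0)    # 0 -- and there's no way back to the 5

# ...but dividing by nothing is against the rules.
try:
    Fraction(5, 0)    # a denominator of zero is division by zero
except ZeroDivisionError as err:
    print("Not allowed:", err)
```

The exception is even named for the crime: `ZeroDivisionError`.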







So what has this got to do with Fractions?


Remember I said fractions are division problems? You take advantage of that every time you convert a fraction to a decimal. (In fact, I think decimals were invented by someone who really hated fractions.) But Division, unlike the other three operations, isn't playing with a full deck of numbers and that means Trouble. When you take a division problem and divide it by another division problem, you're taking Trouble and giving it cream pies to throw.


Fortunately, fractions can also represent proportions, or relations between two things. If you're going to get given two lambs for every three sons, you've got a problem, but you can also figure out how many lambs you're going to get given if you have eighteen sons.


And it's when we think of fractions as proportions that we can finally find a concrete way of understanding how dividing fractions works.


Put away the pizza cutter, and go back to the pizza recipe. You've got a list of ingredients for the crust dough, right? So much flour, so much salt, so much water, etc. If you want to make twice as much pizza as the recipe calls for, you use twice as much flour, twice as much salt, twice as much water, and so forth. And if you want to make half as much pizza you use half the flour, half the water, and half the salt and so forth. In either case, the proportions of flour to salt to water stay the same and the crust should be edible. Congratulations, you divided by a fraction, right?


Except you didn't.


(Say WHAT?)


Okay, okay, I admit it, that's confusing. But you didn't divide. You multiplied, both times. The first time you multiplied by two, and the second time you multiplied by one half. (Remember "of"? One half of X is a multiplication problem.)


But you can undo what you just did, right? No zeros to worry about here. So in the first instance you divide by two and in the second instance you divide by one half and both times you should get back to where you started.


Let's take a closer look at that second instance with some numbers attached.


One cup of flour plus ½ cup of salt plus ¼ cup of water makes two pizzas. I want one pizza, so I multiply each amount by one half and get ½ cup of flour, plus ¼ cup of salt, plus ⅛ cup of water. The proportions are the same. To reverse that I want to divide by ½. I can see the answers waiting for me – all I have to do is multiply by two! And that's where the idea of inverting the second fraction comes from. The recognition that in order to get back to where I started, I can use the number game to turn a division problem into a multiplication problem, just by flipping the fraction over. Some genius saw it, tested the theory with more and more complicated fractions, and sure enough it worked every single time. As long as we turned the fraction we wanted to divide by upside down and multiplied we got back to the place we wanted to be, so that's what we've all done ever since. Even when we didn't do the multiplication problem that we're reversing in the first place.
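The recipe above, done both directions on a keyboard (my sketch, same numbers as the pizza dough):

```python
from fractions import Fraction

recipe = {"flour": Fraction(1), "salt": Fraction(1, 2), "water": Fraction(1, 4)}

# Halve the batch: multiply every amount by 1/2.
half_batch = {name: amount * Fraction(1, 2) for name, amount in recipe.items()}
print(half_batch)    # flour 1/2, salt 1/4, water 1/8 -- same proportions

# Undo it: dividing by 1/2 is the same as multiplying by its flip, 2/1.
restored = {name: amount / Fraction(1, 2) for name, amount in half_batch.items()}
assert restored == recipe    # invert-and-multiply gets us back where we started
```

Flip ½ over, multiply by 2/1, and every ingredient lands back at its original amount, which is the whole trick.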


And that's why manipulatives and pizzas confuse us when we want to understand dividing fractions. Because we only figured out how to divide by fractions once we figured out how to undo multiplying by fractions. And you can't represent that in the real world unless you've done the multiplication first and you've kept the extra supplies lying around to let you put things back together.


So if you're a concrete thinker, and you want to play with dividing by fractions, get out at least two substances which you can split up multiple ways, like water and flour, create a proportion (which is a fraction) by weight or volume between the two substances and then multiply it by a simple fraction like one half or two thirds. Write out the work.


Then play with undoing what you just did, both on paper, and with the materials in front of you. Do it a bunch of times, making sure to see what the numbers do on paper as well as what the stuff does in your hands. Compare the weights/volumes of the materials you got when you multiplied to the material you set aside and see what proportion of the original total is represented by the set-asides.


And when you're done, come back and tell me if this helped you understand dividing fractions at all. Because you know, I'm going to have to explain it to someone again pretty darn soon...


(I'll explain why this works tomorrow, but you're going to have to put up with a lot more jargon.)

*(I borrowed this analogy from Kenn Amdahl.)