To the Limit…One More Time

There’s an interesting article in this month’s Mathematics Teacher about the effects of the particular language elements we use to communicate mathematical ideas.  The main thread revolves around limit concepts, primarily because they’re both philosophically and practically confusing for many beginning calculus students, and because, it turns out, a teacher’s particular choices regarding words and metaphors have an important impact on student (mis)understanding.

Limits comprise a special relationship between mathematical process and mathematical object.  We speak of them in terms of variables “approaching” or “tending toward” particular values, but we subsequently manipulate them as static entities.  I can, for instance, talk about the limiting value of the expression 1/x as x grows without bound (a dynamic concept), but that limiting value is ultimately just a single real (static) number: zero.  There’s an uncomfortable tension in that duality.

Even the notation is ambiguous.  Here’s the fact I mentioned in the preceding paragraph, symbolically:

\lim_{x \to \infty} \frac{1}{x} = 0

The arrow implies motion, but the equals sign implies assignment.  There are elements of both process and object.
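
The tension shows up the moment you try to compute. Here’s a minimal sketch (Python, purely illustrative) of the process half of the duality:

```python
# Evaluate 1/x at ever-larger inputs. The outputs shrink toward 0 but
# never equal it; the limit object -- the single number 0 -- is never
# actually produced by the process.
def reciprocal(x):
    return 1 / x

values = [reciprocal(10 ** k) for k in range(1, 8)]

print(values)  # each value is 10x closer to 0 than the last
assert all(v > 0 for v in values)      # the process never reaches 0...
assert values == sorted(values, reverse=True)  # ...but it is always shrinking
```

The process generates values for as long as you care to watch; the limit itself is simply the number 0, which is never among its outputs.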

I’ve touched on this duality before, which has sparked some great conversations.  A few months ago, I had a supremely interesting email chat with Christopher Danielson after he pointed me toward the writings of Anna Sfard.  He has graciously agreed to allow me to reproduce that conversation here in its original form; I’ve only redacted some of the more boring pleasantries and collapsed some strings of shorter messages into longer ones.  Enjoy.

Chris Lusto
To: Christopher Danielson

Seriously, thanks for the Sfard tip.  I’ve read a few of the articles she has on her website (which, by the way, why are college professors’ websites like the most aesthetically displeasing things on the internet?  Just use a white background and stop being weird.), and you were right: I dig her.  I read the article on duality [PDF] and had one major bone of contention.

I really like the idea of duality versus dichotomy, and she makes, I think, a compelling argument in general.  I just worry it might be a little too ambitious.  She hedges a bit, saying things like “more often than not” mathematical objects can be conceived both operationally and structurally, but I still think this idea of duality runs into serious problems when infinite things come into play, and that’s not exactly a trivial subset of “mathematical objects.”

If we allow that operational conception is (a) just as valid/important as structural and (b) often, in fact, precedes structural conception, what are we to make of processes that never end, that never produce anything because they’re always in production?  Sfard even says, “…interpreting a notion as a process implies regarding it as a potential rather than actual entity, which comes into existence upon request in a sequence of actions.”  But what if we can’t ever fulfill the request, because we’re always on hold, waiting in vain for the end of an unending sequence?  And what about this business of “potential”?  That just smacks of the “potential infinities” of the ancient Greeks that held back western mathematics for a couple millennia.  It seems like we have to admit either (a) that an infinite process can terminate in finite time in order to produce a structural object, or (b) that these objects aren’t really structural at all, because they live in the world of potentiality.  I don’t find either of those particularly satisfying.  I think, in the case of infinite notions, the operational conception leads to a fundamental misconception, à la my student D.

Your thoughts?  Whenever you have a moment, of course.


Christopher Danielson
To: Chris Lusto

“Ambitious” describes Anna Sfard’s intellectual habits very well, I think. She was in a half-time appointment at Michigan State (and half time at Haifa) for part of my grad school time, and she was on my dissertation committee. The woman is crazy smart. And it seems to be a characteristic of Israeli intellectuals to commit very strongly to one’s ideas. Not a maybe or a perhaps to be found in her oeuvre, I don’t think.

I have no explanation for the poor quality of academics’ websites, except to say that it is representative of tech use in higher ed more generally. See also @EDTECHHULK on Twitter and Dan Meyer’s comments here (esp. a couple screens down the page, at “Real Talk about Grad School”).

I’m still formulating thoughts on processes that never terminate. But I’m not sure I fully understand your objection. Your classroom scenarios seem to suggest that indeed process and object are both fundamentally important ways of thinking about infinity. And consider the language of limits…”as x goes to infinity” or even “as x grows without bound”. Those are both process-based ways of talking, right?


Chris Lusto
To: Christopher Danielson

I think Sfard’s right that, in general, process and object are both important methods of mathematical conception.  And yeah, multiple representations are not only admissible, but probably desirable (thinking here, specifically, of HS algebra and the Lesh Model), but isn’t operational understanding misleading when you’re talking about infinity?

Thinking of f(x) = 2x as a process that doubles inputs is valuable, and so is a picture of the resulting object/graph.  And, in a case like this one, I don’t think you lose or gain all that much with either vantage.  Sometimes it’s helpful to think of the process, and other times the object.

But thinking of asymptotic behavior procedurally, for example, is very, very different from the object we call a “limit.”  It’s nice if students can understand that, as x gets larger, 1/x gets arbitrarily close to 0.  I mean, certainly if we hold a numerator constant and increase the denominator, this process yields successively smaller and smaller values.  But I think that’s still like a mile away from understanding that lim_{x→∞} 1/x = 0.  Like, is equal to.  Is identical to as an object.  Is just another name for.  Like, 23 + lim_{x→∞} 1/x = 23.

If procedure (process) is linked to product (object)–like, say, “4 divided by 7” is linked to “4/7”–then how are we to reconcile a never-ending process with a finite, tangible product that can be manipulated like any other mathematical object?  Doesn’t it force us to accept that 1/x eventually “gets to” 0 (which it doesn’t), or that the limit is some kind of potential result (which it isn’t) that can’t really ever be called a proper object because the process is, by definition, never-ending?
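
For what it’s worth, the standard ε–M definition makes the static reading precise without any appeal to motion at all:

\lim_{x \to \infty} \frac{1}{x} = 0 \iff \forall \varepsilon > 0 \ \exists M > 0 : x > M \implies \left| \frac{1}{x} - 0 \right| < \varepsilon

Nothing in that statement moves; it’s just a property that the number 0 either has or doesn’t have.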

I’m going to stop typing words, because I feel like as my words → ∞, my clarity → 0.


Christopher Danielson
To: Chris Lusto

I see…so to boil it down to a debatable question…

Is the object necessarily the product of the process?

Do I have it right?

btw…if I got that question right, then I say ‘no’.

I can think about 1,352,417 and treat it as an object, even though I can assure you that I have never participated in any sort of process that yielded that number.

To say nothing of googolplex.


Chris Lusto
To: Christopher Danielson

I think that’s about right, but with one important qualification.

Is the object necessarily the product of the process?  Then I agree, no.  But you at least have the option of defining it either way.  Even if you’ve never constructed 1,352,417 widgets, there’s nothing philosophically problematic with the process that did/could.  You’re right, there isn’t even a measly googol of anything, but that doesn’t stop it from being the eventual result of (1+1+…+1).


Is the object the result of the process?  Not necessarily, but that’s not a huge problem for me.

Could the object be the result of the process?  If the answer is no (which my gut believes it to be in the infinite case), then how can we reasonably talk about it as both a process and an object?  Does the duality break down?


Christopher Danielson
To: Chris Lusto

See I don’t see a huge difference philosophically between “a product that could be created by a known process, but not in my lifetime” (counting to googol) and “a product that could never be created” (infinity).

In both cases, for me, the process is (1) incomplete, and (2) hypothetical.
Why does it matter at the core whether the result is theoretically achievable or not? Either way, I’ve imagined it.

And I think imagination is key. I don’t recall whether Sfard writes about that or not (probably not, since she’s all language, no imagery). But I do think the transition from process to object is at least in part one involving imagination. I have to imagine the object into being in mathematics precisely because mathematical objects are abstract.

And when I’m struggling to understand a new object (say a limit), it is often helpful to imagine the process that produced it. But I don’t have to see the process through to the end.


Chris Lusto
To: Christopher Danielson

Think about our Hz conversation.  Even with arbitrarily huge numbers of wave combinations, we get sinusoidal waves.  I can get as close to a square wave as I want, but in order to actually obtain the square wave object, the process that got me arbitrarily close to my goal breaks down and fails.  The process is insufficient to the object.  The difference between the square wave and the sinusoidal wave that’s arbitrarily close to square is ultimately qualitative, not just quantitative–and there’s the rub.  Wasn’t that precisely what you and Frank [Noschese] convinced me of?
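
To see that breakdown numerically, here’s a quick sketch (Python; not part of the original exchange). The partial sums of the square wave’s standard Fourier series get arbitrarily close to square, but every one of them overshoots near the jump by roughly 9% of the jump’s height (the Gibbs phenomenon), no matter how many terms you take:

```python
import math

def square_partial(t, n_terms):
    """Partial Fourier sum for the odd square wave sign(sin t):
    (4/pi) * sum of sin((2k+1)t)/(2k+1), k = 0..n_terms-1."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * t) / (2 * k + 1) for k in range(n_terms)
    )

# Away from the jump, the partial sums close in on the square wave's value, 1...
approx = square_partial(math.pi / 2, 200)
print(abs(approx - 1))  # small, and shrinking as n_terms grows

# ...but just past the jump at t = 0, every partial sum overshoots.
# The peak hovers near 1.18 regardless of n_terms: the approximation
# never *becomes* square. The process is insufficient to the object.
peak = max(square_partial(i * 1e-4, 200) for i in range(1, 400))
print(peak)
```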


Christopher Danielson
To: Chris Lusto

But the square wave is the limit. There’s the object. The limit (process? object?) produces the square wave.

I have no idea what I convinced you of. But I know that the argument I was making was that polynomials, by definition, have finitely many terms. And e^x can be written as infinitely many terms, each one a polynomial. Is e^x a polynomial? By the letter of the law, no. But in spirit? Yes. And that’s beautiful.

I got in trouble doing a CMP demonstration lesson once. I talked with students about a cylinder being a circular prism. The algebra teacher observing got upset with me because a prism has polygonal faces. Ergo, “circular prism” is nonsense.

I had occasion to follow up a year or so later with my former complex analysis professor from MSU grad school. He had absolutely no problem calling a cylinder a circular prism.  No problem at all.

What to learn? Unclear.


Chris Lusto
To: Christopher Danielson

I see a huge distinction between “unachievable due to resource constraints” and “unachievable by definition.”  Why is the possibility that CERN moved some particles faster than light a big deal?  We’ve already moved all kinds of stuff 99.999999% that fast in the lab.  The extra .000001% is practically trivial, but philosophically enormous.  Faster-than-light travel didn’t just seem practically impossible; it seemed literally impossible, probability exactly 0.

The difference between almost 0 and 0, no matter how small, is mathematically gigantic.

This is seriously all kinds of fun, but I have to go do some domestic things.  To be continued…in finite time.


Christopher Danielson
To: Chris Lusto

That’s the beautiful thing about email. It is at heart an asynchronous medium.

By the way, some would say that you have pointed to an important difference between mathematics and the sciences with your example.


Thanks so much to Dr. Danielson for (a) having this discussion, and (b) letting me publish all the gory details.  Oh, and (c) making me smarter in the process.


10 thoughts on “To the Limit…One More Time”

  1. This discussion reminds me a lot of the Theory of Knowledge discussions that the IB curriculum tries to foster.

    • As they should. This is good stuff. If you’ve read more than like one of my posts, you already know that Danielson is both a generous commenter and a seriously smart dude. If you don’t already read his blog, click on his name in the post and check it out. Lots more gems in there.

  2. This is such a rich discussion. I love it. A couple thoughts come to my mind, and I’m hoping you people can help me get straight on this, because I’m having trouble understanding it:

    What’s wrong with this sentence of mine?

    “A hamburger can be conceived of both operationally and structurally. You can either think of a hamburger as being a finished thing, or as the result of cooking a bunch of meat.”

    If I’m reading Sfard right, though, this seems entirely parallel to what she says about numbers:

    “A rational number can either be conceived of as a pair of integers or the result of the division of integers.”

    I bolded “or the result of”, because that’s where I think that she’s cheating. Notice that you don’t have to use “or the result of” to explain the duality of functions.

    “A function can either be conceived of as a table or as a computational rule.”

    But I worry about even functions. Because, what is a computational rule? It’s easy to say that it’s a bit of instructions for a user to follow when the rules are simple. So when a function says “double this input” it’s easy to think of the function as a procedure to follow. But then what about truly arbitrary functions, like {(3, -234), (4,5), (0,0), (100, 2), … } ? Does that mean that there’s some procedure that I’m following?

    Clearly not, or at least not anything that we’d normally call a rule. That’s why, when I teach my students functions, I find it helpful to make a distinction between functions and rules. Rules are the sorts of things that tell you what to do to inputs. f(x) = 3x is a rule that says “triple your thang.” But functions are tables. Not all functions have rules, and not all rules make functions. So even in that case, it seems that it’s important to separate the dynamic/procedural and static understandings of function for the learner.

    When it comes to something like limit, my instinct is to similarly separate the two understandings. Fundamentally, a limit is something static. Sure, it’s the product of something dynamic, but so are lots of things.

    • I don’t think there’s a single thing wrong with your hamburger sentence. In fact, I think it embraces Sfard’s duality nicely. The process of making a hamburger and the hamburger object that is bought and sold and eaten are inextricably linked, particularly in the context of learning new concepts. Think about how you would explain hamburger to a foreigner. “You take some beef, run it through a grinder, mix it with some binders and seasonings, press it into a patty, grill/fry it up, and stick it on a bun.” Of course you wouldn’t be completely satisfied with that; you would, immediately afterward, take your friend to Five Guys and say, “This, this right here is a hamburger.” Duality, not dichotomy.

      And, when you replace hamburger with function, you and I begin to differ. I think you definitely lose something if you strictly think of functions as tables, as static objects. You’re right that not all rules make functions, of course, but I contend that every function is a procedural rule. True, some rules are terser than others, but that’s an information-theoretic distinction, not a mathematical one. The table of values (1,3), (2, 6), (3,9) … is highly compressible as “triple your thang.” It has very low entropy, which means I can generate it with a short program. Your “truly arbitrary” function is still a procedural rule, but now it’s a much longer program: “If your thang is 3, output -234; if your thang is 4, output 5; … ” Even if you need one line per domain value, that’s still an executable procedure, just not a compressible one. It’s a quantitative difference only.
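
      That information-theoretic point is easy to make concrete. A quick sketch (Python; the table values are Michael’s arbitrary example, the rest is illustrative):

```python
# The claim: a "truly arbitrary" function is still a procedure, just an
# incompressible one. The table itself becomes the (long) program.
table = {3: -234, 4: 5, 0: 0, 100: 2}  # the arbitrary function, as data

def f_table(x):
    # One "line" of program per domain value: perfectly executable,
    # but no shorter than the data it reproduces.
    return table[x]

def f_rule(x):
    # "Triple your thang": a short program generating an infinite table.
    return 3 * x

print(f_table(3))                      # the lookup is a procedure, too
print([f_rule(x) for x in (1, 2, 3)])  # the rule just compresses better
```

      Both are executable procedures; the only difference is how well each program compresses its table.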

      Plus, I think you lose some powerful cognitive tools if you ignore the dynamic half of functions. Forget “triple your thang” for a minute and let’s imagine how f(x) = 3x acts on the real line. It puts a pin in 0 and then stretches the whole thing so that every point is 3 times farther away than it used to be. If f(x) = x + 3, it drags the real line to the right a little bit. If f(x) = 0, it collapses the entire thing into a single dimensionless point. That’s beautiful. This line of reasoning opens up the possibility for students to develop strongly geometric intuition about functions as well, which is nice not only for the present, but almost a necessity if they ever hope to tackle functions of a complex variable. Want to look at a table of values for f(z) = z²? Me, either. I want to look at the complex plane being twisted around on itself.
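
      The stretching picture can even be checked directly. A toy sketch (Python, illustrative) of how f(x) = 3x and g(x) = x + 3 act on a handful of points on the line:

```python
# f(x) = 3x pins 0 and stretches the line: every gap between adjacent
# points triples. g(x) = x + 3 slides the line rightward: gaps unchanged.
points = [-2.0, -0.5, 0.0, 1.0, 4.0]

stretched = [3 * p for p in points]   # the action of f(x) = 3x
slid = [p + 3 for p in points]        # the action of g(x) = x + 3

def gaps(ps):
    # distances between consecutive points
    return [b - a for a, b in zip(ps, ps[1:])]

print(gaps(points))     # original spacing
print(gaps(stretched))  # every gap tripled
print(gaps(slid))       # spacing preserved; the whole line just moved
```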

      Anyway, that’s my take. Oh, and re your last comment, I agree that a limit is fundamentally static. And not at all the product of something dynamic. The schism I reject in your function interpretation is exactly the one I think exists in the case of infinite processes. Functions can be both dynamic and static; with limits, I think we nurture weird and serious misconceptions when we appeal to dynamic metaphors. They break down. That’s essentially where Danielson and I launched the discussion. He’s not convinced.

      • I don’t think there’s a single thing wrong with your hamburger sentence. In fact, I think it embraces Sfard’s duality nicely.

        But doesn’t that mean trouble for Sfard’s thesis? I read Sfard to be trying to give a framework for explaining what makes the acquisition of mathematical concepts so difficult, and her answer has to do with the supposedly unique duality of mathematical conceptions.

        If the hamburger example is right, then this isn’t about math. This is just about things. And there’s nothing specifically interesting about this duality. If I want to understand bricks, I should take a look at a brick and I should also figure out how you make a brick. If I want to understand goats, I should take a look at a goat and also figure out how you make a goat. If I want to understand rational numbers, I should take a look at a rational number and what they’re used for, and also take a look at where they come from.

        And it’s not about ontology. It’s about pedagogy (and if you want an extra syllable or two, epistemology). It’s about what it means to understand a thing, and how you go about gaining that understanding. That was the issue that I was having with Sfard’s paper.

        I contend that every function is a procedural rule.

        Yeah, I agree. But it’s not natural for the beginning learner. So what are we talking about, exactly? Ontology? Epistemology? Pedagogy?

        I argued above that this duality shouldn’t be framed as ontology. As epistemology, you might be right — I’m not sure. Ultimately to understand functions maybe they must be known as both procedural rules as well as tables. But what about pedagogy?

        The first time I taught functions I told kids that functions were rules, and they told me that arbitrary tables weren’t functions, because they didn’t follow a rule/procedure. So the next time I taught it I made a distinction — functions are complete tables, but you can follow a rule to produce a function.

        Clarity isn’t always better than duality. Graphs, equations and tables are all ways of representing the same underlying relationship, and I don’t want those separate. But I’m not sure what I lose by clipping functions and rules. The only thing that I can think of is that I’m keeping my students from having a very, very abstract sense of what a rule is, and I think that’s OK.

  3. Ed Dubinsky has some interesting things to say about how the process/object duality fits into how we learn mathematical ideas. His basic thesis is that when we learn a new mathematical thing, we first conceive of it as a process or the result of a process, and we later get the idea of the thing as an object, and you have to start thinking of it as an object before you can move on to doing more stuff with it…

  4. Pingback: Infinite Problems « Recipes for Pi: Making Smart Cookies

  5. Michael: I should have said that there’s nothing wrong with your hamburger sentence as an analogy for Sfard’s thesis. I think the salient difference is that, in the case of a hamburger, both the object and process are concrete. I can be presented with a hamburger; I can watch it being made. Not only can those two things be stored independently in my brain, but they can be presented to me independently as I learn about them. In my own personal case, I’m pretty positive I was choking down ground chuck before I had even the faintest grasp of its provenance, and probably before I even connected hamburger and cow in my tiny brain.

    We just don’t have that luxury in mathematics, where the objects and processes are both abstract. I can’t really show you 4/7 (at least not in all its glory), and I can’t really show you 4 ÷ 7, either. Sure, I can manipulate some concrete objects as proxies for those mathematical constructs, but that’s not quite the same thing. Four apples are not the number four. They’re not even an instance of the number four. They’re just a group of objects that happen to be imbued with fourness—whatever that means. And once we free mathematical objects from referents, things get even hairier. So we have these abstract objects, produced by/paired with/linked to abstract processes, which all seems bad enough, and then we throw infinite processes into the mix. And that, I contend, is where the duality gets strained to the breaking point.

    And that is my pedagogical point (and the MT article’s). All philosophy aside (maybe too late for that…) I think the fact that we appeal to procedural metaphors for procedures that, by definition, never end, yet somehow manage to produce finite, well-behaved results, is extremely confusing. Otherwise, I agree with your general sentiment that while this kind of picayune lawyering is important and fun for the mathematically inclined, our goal isn’t to produce kids that can opine on the ontological/epistemological nature of functions, so some convenient shortcuts don’t constitute malpractice.

    Ed: That’s similar to Sfard’s stance, although she’s a little more careful about the interplay. I don’t think she’d say that procedural understanding necessarily precedes structural understanding, but it certainly works out that way in many cases. If I weren’t so lazy, I’d quote her to that effect.

    Thanks for your comments!

  6. I confess I haven’t read the MT article that sparked this conversation, but it reminded me so much of two really neat books that I wanted to stop by and share them:

    1) Where Mathematics Comes From, by George Lakoff and Rafael Nunez. Especially the section “The Embodiment of Infinity” which talks about just this question — how do we make sense of infinity using the metaphors we have access to about ongoing processes? And how do the ways we make sense of infinity change the mathematics we do? They discuss “granular numbers” and the hyperreals and worlds in which, for example, 0.99999999… does not equal 1.

    2) How Mathematicians Think, by William Byers. He writes about the fundamental importance of just such ambiguities in creating new mathematics. The process/product duality is an ambiguity, and one which we mostly cope well with (e.g. 4 ÷ 7 and 4/7), but it gets extra ambiguous when dealing with infinite processes. And so infinity is a concept which is inherently ambiguous or dual: it holds two separate, complete frames of reference in tension, and new mathematics comes when you reconcile them.

    Happy Summer Reading,
