Montaigne on the Education of Children

“The greatest and most important difficulty in human knowledge,” Montaigne says, “seems to lie in the branch of knowledge which deals with the upbringing and education of children.” That seems right; and yet it’s hard to argue that we’ve solved the difficulties.

The problems Montaigne diagnosed with education in his day, almost five hundred years ago, are really no different from the problems we still see today. He pleads for an education that focusses on the individual, even going so far as to advise the person to whom his letter is addressed not to send her son to school, but instead to find a full-time private tutor. Our education focusses so much on the masses that it fails to give anyone a real education:

“If, as is our custom, the teachers undertake to regulate many minds of such different capacities and forms with the same lesson and a similar measure of guidance, it is no wonder if in a whole race of children they find barely two or three who reap any proper fruit from their teaching.” 

What is the ultimate point of our education? We debate that question keenly, but for Montaigne it was clear: “The gain from our study is to have become better and wiser by it.” By this he means understanding or a kind of judgement that informs thought and action. Memorisation is the enemy of understanding:

“It is the understanding… that sees and hears; it is the understanding that makes profit of everything, that arranges everything, that acts, dominates, and reigns; all other things are blind, deaf and soulless. Truly we make it servile and cowardly, by leaving it no freedom to anything by itself. Who ever asked his pupil what he thinks of rhetoric or grammar, or of such-and-such a saying of Cicero? They slap them into our memory with all their feathers on, like oracles in which the letters and syllables are the substance of the matter. To know by heart is not to know; it is to retain what we have given our memory to keep.”

Memorisation is unrelated to education, for an education properly understood must be about understanding and judgement. And yet our schools continue to teach to tests, and tests require almost nothing but memorisation. This recalls Seneca’s lament that “We learn not for life, but for the schoolroom.” Likewise, when studying history, our schools focus on the irrelevant parts that are easily taught, and not on the essence of how what we learn could inform our lives:

“But let my guide (the teacher) remember the object of his task, and let him not impress on his pupil so much the date of the destruction of Carthage as the characters of Hannibal and Scipio, nor so much where Marcellus died as why his death there showed him unworthy of his duty. Let him be taught not so much the histories as how to judge them.”

Montaigne then makes what is today a most controversial argument: that science should be left entirely aside until students have acquainted themselves thoroughly with the philosophy of how to live. The common logic today is that students should equip themselves with technical skills first and learn about life later; Montaigne entirely reverses this:

“It is very silly to teach our children ‘What effect have Pisces and Leo, fierce and brave,/Or Capricorn, that bathes in the Hesperian wave,’ the knowledge of the stars and the movement of the eighth sphere before the knowledge of themselves and their own movements.”

It is an argument for the humanities: that our first task in education is to come to know ourselves, so that we can then devote ourselves to a vocation once we are sure of the direction we wish our life to take. The sciences are a luxury; if we don’t know how to live, there’s no point in thinking about them. Montaigne argues, again following Seneca, that so many people leap straight to vocational training before having learned how to live because they misunderstand philosophy. Philosophy has been confused with complex constructions of logic (and philosophers are mostly to blame for that), when its essence is how to live.

I think all too often we feel the problems Montaigne diagnoses—the rote learning, the mass production that education has become, the sense that we leap into a career before we truly know ourselves—but are inclined to put these down to modern education. His is an important reminder that formal education has changed but little throughout the ages, with students, teachers, parents and public figures all concerned about the same things, but with little idea what to do about them on a system-wide level. If anything, Montaigne demands that we—as students or as parents—take responsibility for our own education and the education of those around us, limiting whatever harms are done, and guiding towards a lifelong ability to learn in order to understand.


The edition I’ve quoted from is The Complete Essays of Montaigne from Stanford University Press, translated by Donald Frame.

Can New Zealanders and Australians Afford To Study at a US University?

When I’m asked how one goes about studying at a US university, or at least one modelled on the American higher education system, the first question is usually something along the lines of: is “college” the same thing as “university”, and what is a liberal arts degree? I decided to start writing up my responses to the questions I get asked all the time, and I answered that first general question in my article A Guide for Non-Americans: What is “college” and how does it differ from university?

The second question people ask is often not even phrased as a question. It goes something along the lines of, “Oh, there’s no way I could afford to study in the US. It’s too expensive, and I’m not that rich.” The question embedded in that is, how do you afford it? Did you get a scholarship, or are you simply very wealthy?

The answer, to both of the above questions, is no. Well, mostly no. I have received a scholarship to study at Yale-NUS and for my time at Yale, but there’s still no chance I could afford to study there if that were the only financial support I’d received. So here’s a guide to how it works, and you should, for the most part, find reason to be pleased: if you are committed to working incredibly hard for a few years to gain admission to a top US university, the finances will work themselves out. Really.

If one of the first things you’ve done is look at the fees listed on US college and university websites, you would indeed have reason to close your browser, run a mile from the computer, and never again consider studying in America. Yale, for instance, says that the base cost of attendance for one year for an undergraduate in 2016-17 is USD$68,230. I emphasise: United. States. Dollars. At current exchange rates, that’s NZD$98,000 and AUD$93,000. Of those fees, USD$50,000 is the tuition cost, and the remainder covers room, board, and other expenses. I’m taking Yale as my example, but the numbers are really very similar at most universities for an international student (we aren’t eligible for any subsidies).

But unfortunately for those of us from countries outside the United States, things get even more expensive. Roundtrip flights from Wellington, New Zealand to New York, for instance, come to roughly NZD$2,500, and you’d be looking at doing that flight twice a year. The reality is you’re unlikely to stay in your room on campus for the entire semester, so you’re going to need money to cover other living expenses—maybe a couple of thousand per semester. Exchange rates can be truly scary, too: some of my friends have seen their cost of attendance double or even triple in the course of a year as the rates swing.

To put it simply: the sticker price for a year of study at a US university is going to be over $100,000, whether in Australian or New Zealand dollars. For a degree, then (four years at US universities), it’s going to come to roughly half a million NZ/AU dollars.

Before I get onto the details of how that amount is very rarely what you would actually pay, here’s one brutal reality: only the top universities have the financial resources to subsidise the cost of your education. The middle band of US universities—the ones which in all likelihood you haven’t heard of—will not provide financial assistance to international students. This means that unless you can afford the full price of the education as above, you need to gain admission to one of the top universities to have your education subsidised. Think the Ivy League.

And what of the Ivies, the other top small liberal arts colleges, and international universities like Yale-NUS College and New York University in Abu Dhabi and Shanghai? What do they do differently? The key term you want to know is financial aid.

Here’s a statement taken directly from Yale’s page on financial aid:

“Yale admits students without regard to their ability to pay and meets 100% of demonstrated financial need. For all students. Without loans.”

Restated, this means that if you are admitted, Yale will then look at your family’s financial situation and make an offer of financial support that will make it feasible for you to attend. It’s not about taking out a loan. It’s simply a subsidy on the total cost of attendance, and the subsidy can vary from a few thousand dollars to 100% of the cost. Yale also states that “Families whose total gross income is less than USD$65,000 (with typical assets) are not expected to make a contribution towards their child’s Yale education. Over 10% of Yale undergraduate families have an Expected Family Contribution of $0.” In 2015-16, the average financial aid award was USD$43,989; in other words, the average award cut fees by almost two thirds.

I’m taking Yale’s statements here as examples, but most other top universities offer almost identically-worded policies. The immense endowments of these universities make these generous financial aid policies possible, where other universities are simply unable to offer them.

It’s disappointing that many immensely talented young New Zealanders don’t bother applying to top international universities each year simply because they assume it will be too expensive. I’ve had friends and acquaintances who dismissed the idea of international study from the start because their parents, not knowing about financial aid, had told them not to even think about it.

The hard part is getting in. It is hard, but not impossible. If you have the brains and the work ethic, gaining admission should be your only focus; you should not let financial concerns stand in your way.

If you haven’t already, I do encourage you to read my post on liberal arts colleges in the US. US higher education is unique for its focus on the liberal arts, which offer students an opportunity to figure out what they should do before going on to learn how to do it in postgraduate study. That’s very, very different to what our universities in this part of the world offer, and it’s an idea that I think we should take far more seriously.


Some links to additional information and examples are below.

Harvard’s financial aid information

Yale-NUS College’s financial aid information

The University: An Owner’s Manual, by Henry Rosovsky

A Guide for Non-Americans: What is “college” and how does it differ from university?

An introduction to American higher education and the liberal arts for New Zealanders and Australians considering tertiary study in the US

Applying as a New Zealander to study for university in the United States was a daunting process. There were so many things I’d never heard of: the SATs, the Common App, “safety schools”, “reach schools”, financial aid, liberal arts, to name just a few. But I think what made the process most daunting right from the start—and what made it difficult to find the right information—was that I didn’t understand the fundamental terms to describe the university itself in the American system.

Americans do not go to university for undergraduate study. They go to “college”. In New Zealand everyone talks about going to “uni” straight out of high school, and we’ll be choosing between doing law, medicine, commerce, engineering, architecture, and other vocational degrees. But when you go to American universities’ websites to look at studying those things, you’ll soon see that all of them require a “college” degree as a prerequisite. And no, that doesn’t mean a high school diploma, despite us in NZ using “college” to refer to high school. This can all be overwhelming and can make little sense for those of us in countries that follow a roughly “British” model of education, and it’s something I get asked about often. So here’s a brief guide.

The first thing to know is that in the American system of higher education (what we call “tertiary” education), you can only complete a Bachelor of Arts or a Bachelor of Science for your undergraduate degree. That’s right—no BCom, MB ChB, LLB, BAS, BE, etc. In the United States, your undergraduate degree will only be a BA or a BSc, with honours, and it will take you four years to complete. (There are some exceptions to this rule, but they are a very small minority.) The “honours” part is usually not optional, as it is in Australia or New Zealand—if you get into the college, you’re going to do the honours part of the degree. This is why you’ll often hear many top schools in the United States called “honours colleges”: simply by being admitted, every student is doing an honours program—it’s not something you apply for in your third year of study depending on how well you’ve done so far, as it is in New Zealand and Australia.

The term “college” itself comes from the fact that traditionally, higher education in America was conducted at liberal arts colleges. There were no universities. Yale was Yale College, not Yale University. Harvard was likewise Harvard College. These were usually small institutions at which students spent four years (as they do today) and completed a BA(Hons) (as they do today), and what made them truly unique was that all students lived on campus accommodation for all four years of their education (as they do today). Very often, professors lived and dined with students as well (as they often still do today!) So “college” is a physical place where you live and study for four years while completing your undergraduate education.

Universities later grew up around the colleges and began to offer professional or vocational degrees, which in the American system can only be studied at the postgraduate level. Yale University today, for instance, comprises fourteen “schools”—one of which is Yale College, the only part of Yale to offer an undergraduate education. The other thirteen schools all offer postgraduate degrees, and twelve of them offer professional/vocational degrees—law, medicine, architecture, business, and so on. And yes, you first need an undergraduate “college” degree (or an equivalent undergraduate degree from another country) to gain admission into one of the “schools”.

So “colleges” are now often small parts of overall universities in the United States, but they remain an important centre for the whole institution. Sometimes, the original colleges declined to set up larger universities around the small college—these are now usually referred to as “small liberal arts colleges”, where the whole institution still only offers a BA or a BSc with Honours. Examples are Pomona College, Swarthmore College, Amherst, Williams, Wellesley, Haverford, Middlebury, Carleton, to name some of the better known ones.

One other thing to note is that because you live on campus in the American college, you will be placed in a specific residential college for your four years of study. While studying at Yale College, for example (itself one of the fourteen schools of Yale University), I was placed into Branford College, one of the fourteen constituent residential colleges that make up Yale College. Each of the fourteen colleges is its own residential community with student rooms, a dining hall, a courtyard, and other facilities. (Fourteen is not a significant number; it’s mere coincidence that the number of schools and the number of residential colleges at Yale are the same.) Each residential college has a “Rector” or a “Head”, who is responsible for your residential life, as well as numerous other staff. Residential colleges truly are like smaller families or communities within the larger college, which is itself a smaller part of the overall university. (That’s a lot of uses of the word “college”—I hope it’s all clear!)

In New Zealand and Australia, as well as many other countries, we have a certain disdain for “arts degrees”. The implication is that students who do an arts degree either weren’t smart enough to get into a program like law or medicine, or simply wanted an easy time at university. I got a great deal of flak from friends and friends’ parents when I was applying to US universities and they heard I would be doing an arts degree—they assumed I was giving up and wanted to party at university, as the stereotype often goes here in New Zealand. But in the United States, this couldn’t be further from the reality, since all undergraduate programs are what we call “arts degrees”. The American model of higher education is entirely built around the arts degree, and it is virtually impossible to study anything other than an arts degree for your undergraduate education. Even doing a BSc for your undergraduate years will require you to have taken many courses in the humanities and social sciences—the BSc is essentially an indication that you majored in a science subject at college, and intend to pursue some kind of science-related study for your postgraduate degree.

This emphasis on the arts in the American system, and the impossibility of doing any professional or vocational study for undergraduate education, gets to the heart of what makes higher education in the US truly unique. It all comes down to “liberal education”, or the “liberal arts degree”. The term is widely misunderstood, even in the United States, and yet it’s critical to understanding US colleges and to determining whether study in the US might be for you.

Here’s how I explain liberal education: whereas university study in Australia and New Zealand is concerned with learning how to do things—how to be a lawyer, or a doctor, or a businessperson, and so on—undergraduate college education in the United States is intended to be about learning what you should do. 

Let’s delve into this a bit more. Undergraduate liberal arts colleges in the US—and those based on the US model, like Yale-NUS College in Singapore, where I study—often sell themselves on the breadth of education you’ll receive. The idea is that whereas in Australia or NZ we immediately do highly specialised professional degrees and learn little outside that subject area, in the US undergraduate education is about broadening one’s mind, trying a whole range of subjects, and essentially having four years to explore intellectually before committing to a “vocation” which you will study at the postgraduate level. You’ll choose a “major”, which is the area you think you’re most interested in, but often there is a “common curriculum” or “distribution requirements” that force you to take a range of subjects and classes outside that area. At Yale-NUS, for instance, one’s major (mine is Philosophy, Politics and Economics) makes up only 30% of the courses you take during the four years.

Breadth is an important characteristic of the liberal arts, but it is not the defining characteristic. What breadth achieves—and the reason all liberal arts colleges offer it—is the chance to figure out what you want in life. Liberal education is about having four years to learn not how to do things, though that may indeed be a part of your education, but to explore so that you can work out what you should want to do. Instead of asking during your undergraduate years, “How do I become a doctor, or a lawyer, or a journalist?”, liberal education is about asking, “Should I want to be a doctor? Or should I be a writer instead?” It’s about using classes, and professors, and books, and mealtimes every day in the dining hall with friends and teachers, to learn about yourself for four years before committing to a profession or a vocation. With those four years of exploration, you should then be much more confident in the decisions you make about what to do with the rest of your life. And, of course, you then specialise to become a lawyer or a doctor during your postgraduate study.

Another way of explaining it is that university in Australia or New Zealand trains you to be a specialist in a certain subject area in which you’ll work for life; college in the US educates you on what it means to be human, so you’re more sure of what you should later train to be.

There are, of course, advantages and disadvantages to each system. In the US you’re more likely to become a well-rounded human being with wide-ranging knowledge and interests, and you’re more likely to be confident and sure of what vocation you choose to commit to by the time you think about postgraduate education and beyond. The downside is that you spend an extra four years doing this, whereas in Australia and New Zealand (as well as South Africa, India, and really anywhere that has developed its education system from the British model) you would spend that time specialising and then beginning your career earlier. You can therefore find yourself further behind in a career than those who had studied the vocation for their undergraduate study. A college education, then, is a luxury; not everyone can afford it, and we should remember that this kind of choice in education is a huge privilege.

Which system better suits you is therefore an individual choice, though I personally am a strong proponent of taking the extra time the American system gives you to learn about yourself and the rest of your life. I think it encourages people to discover what truly matters to them—what kind of interests and work they’re willing to devote their lives to. Without being very sure of the decision you make in the Australian and New Zealand education systems, you might find yourself waking up in your mid-twenties realising that the degree you’ve spent five or more years training for is in fact not what you want to do with the rest of your life. That’s a costly and disappointing realisation to have. Studying at an American college, by contrast, will give you a foundational understanding of yourself—it will (if it lives up to its ideal) have helped you answer the question of what you should do with your life, rather than how to do it—and that will help you be a good person, whatever you later go on to do. Of course, you can have this kind of education for yourself in an Australian or New Zealand university by structuring your own education: you can do a BA(Hons) as you would in the United States. This is a great option, but you will need a degree of self-motivation and determination that you might not need at college in the US; here, you’ll face the stigma attached to “arts degrees” and won’t have the encouragement to explore intellectually that you would at a US college.

I’ll leave it there, and further note only that the picture I’ve painted of US colleges is a kind of ideal type. The degree to which colleges live up to that ideal will depend on the institution, and even down to the classes you take and professors you get—but nonetheless the fundamentals stand. That’s a brief primer on colleges vs universities, and what truly makes the American “liberal arts college” unique. Feel free to leave a comment or contact me if you have other questions, and I can always delve into specifics on other college-related questions in another blog post.

To end, here’s some additional reading I strongly recommend to anyone who is interested in the US higher education system and the difference between education and training:

Why Teach?, by Mark Edmundson

The Voice of Liberal Learning, by Michael Oakeshott

College: What it Was, Is and Should Be, by Andrew Delbanco

In Defense of a Liberal Education, by Fareed Zakaria

Explaining the Value of Liberal Arts Education in New Zealand

An article in Wellington’s Dominion Post today describes how an “unpredictable labour market makes arts degree more relevant.” The gist of the article, by Richard Shaw, a professor and director of Massey University’s BA program, is that the workplace of the future will require more arts degree graduates. As the speed of technological change increases, technical jobs are becoming computerised, and entirely new jobs are being created. The workforce therefore needs graduates with “the capacities to think critically, communicate clearly, and cope with cultural diversity”, those skills that an arts degree teaches.

The argument is the one that arts and humanities programs the world over have been making over the past decade, as the call for technical specialisation has seen their graduation numbers decline. Arts programs have found themselves needing to justify their existence on the same terms as technical programs, which speak in terms of productivity, employability, and ‘usefulness’. Specialised university degrees boast of higher employment rates for graduates and higher salaries, and moreover assert that they are more practically useful to economies and societies. But by attempting to counter those claims, arts programs have merely subordinated the arts and humanities to the values of science and technology—values that the arts and humanities have always stood as a counterbalance to.

I should say up front that I entirely agree with all these arguments that defend the arts and humanities on the terms of employability and usefulness. Arts degrees are the best foundation for anyone entering a world in which the meaning of work and technical skill changes annually. But while agreeing with the argument, I also think it is counterproductive: by subordinating arts degrees to the terms of value set out by technical programs, we lose the essential values—and, yes, usefulness—of the arts and humanities. Simultaneously, we make it less likely that those students who study the arts and humanities will actually receive that kind of education; they will seek in it instead the kind of practical usefulness of technical programs, and look past what the arts and humanities truly offer.

Shaw falls into the trap when he says in his second paragraph, “Let’s put aside, for the purposes of this argument, all of those socially desirable things that a BA can impart: knowledge of self and curiosity regarding the world, the capacity to listen as well as to mount a cogent argument, and the ability to ask awkward questions of those in positions of power.” If we set those aside, we set aside the essence of an arts education. We set those aside, and then the only argument left is an attempt at saying, no, arts degrees are better for your job prospects. And if I were a prospective arts student struggling to justify that path against those who told me to be practical, to be realistic and think about a job, I’m not sure I’d listen to Shaw on blind faith that employers would leap at the chance of having me after graduation. And even if I did trust that, I would then be taking an arts degree for practical, prudential reasons—looking daily during my time at university for chances to improve a CV, taking classes and reading books for how they might put me ahead of others in the hunt for jobs. In doing that, I’d have missed what an arts degree can offer that nothing else can—precisely those qualities that Shaw lists and then dismisses.

The real challenge for proponents of the arts and humanities—what a different tradition calls ‘liberal’ education—is to define its value on its own terms, and to resist the easy option of merely throwing statistics back at technical programs. Doing that makes for a neat op-ed, but does not help with the harder task of persuading students and society of the essential value of liberal education on its own terms.

In the United States this debate over arts degrees and technical training is much further developed, likely because the BA degree is the norm for American undergraduates. In the US, and in a range of other countries following the US system (including at my university, Yale-NUS College in Singapore), undergraduates complete a four-year BA degree and then follow it with specialised training in postgraduate study. There, the debate is not so much over whether students should undertake BA degrees or other degrees, but rather over what a BA should encompass—whether, within their BA, students should major in humanities subjects or in the sciences and social sciences for employability.

As a result, most US colleges and universities take a broad approach to encouraging students to study arts degrees, or the “liberal arts” as they are known. There is a focus on the intangible but very real benefits of a liberal education, ranging from slogans like “Four years to transform your life” through to the same kinds of statistics advanced by Shaw in the New Zealand context. At the very least there is the recognition that the arts and humanities bring value of a different kind from the statistics and productivity of other disciplines—and that those values are ones students should feel proud, rather than worried, to pursue.

Judging from this debate over the usefulness of liberal education in other countries, ours in New Zealand is just getting started. We should ensure that arguments made in favour of the arts and humanities demonstrate and advance the values that those disciplines bring, and not append them as garnish to the values of specialist university degrees.

A Future Without Personal History

Note: In 2011 I wrote this article for ReadWrite, a widely-read blog covering the technology industry, on what would happen if we didn’t make an effort to store our communication history. I lamented how older generations could look back through letters, physical records of their lives with one another, and yet we would seemingly be left with nothing. The article inspired an impassioned response from journalist Paul Carr at the blog TechCrunch and a lively online debate. I ultimately ended up founding a company based on the premise I wrote about.

I thought I’d re-share the article as I rediscovered it. I was sixteen at the time—things have certainly changed, and you’ll have to excuse the writing. And, irony of ironies, I now rather enjoy writing letters.


Remember those pieces of paper with handwritten words on them that you used to post to people? “Letters” I think they’re called. To be honest though, I wouldn’t have a clue, as I’ve neither sent nor received one in my 16-year-old life.

I’m sure the majority of readers here have at least sent a personal letter to friends or family in their lifetime. However, the same cannot be said about my generation. I’ve sent tens of thousands of emails, Facebook messages, SMSs, and IMs – but never a single letter.

More than solely being a form of communication, letters are a very effective historical item. Think about letters sent home to families from the soldiers on the battlefields of both world wars. Letters were kept because they have a perceived value – it took time and effort to send a letter, and therefore people viewed them as much more valuable.

My parents still have letters that they received more than 30 years ago, and when they read them now they say that they detail entire relationships and friendships. They have vast amounts of information about their own history stored inside the letters that they sent and received. It goes even further than that. My grandmother still has letters she received from her grandmother. If it weren’t for those letters, all that information about my own family history would have been lost, or confined to memory (which, as my parents are discovering, fails us all eventually).

And yet, I can’t tell anyone what I was discussing with someone a month ago. That’s a testament to the digital age that my generation and I are native members of. I find myself feeling incredibly guilty that my parents and grandparents went to so much effort to ensure that our family history was kept, and here I am frequently losing information about my life.

The frequency and brevity of messages sent today, combined with the numerous mediums used, mean that this personal information now has a much lower perceived value: your email storage fills up – you delete all your messages. You get a new mobile phone – all of your SMSs are lost.

Some people are already worrying about what may happen if we continue to throw away our information. For example, the U.S. Library of Congress announced in April last year that it would be archiving every Twitter message ever sent. Sure, it’s a phenomenal undertaking, but it is in no way enough. Think about all the different mediums of communication you use.

For example, today alone I have communicated with people via SMS, email, Facebook messages, Facebook chat, Whatsapp Messenger, Skype chat, and Twitter. Out of those, only my public Twitter updates are being stored. There are other efforts like the Library of Congress’ undertaking, but mass archiving won’t help us store our individual histories in a way that we can access.

What happens if, in three years, I want to go back through all my communications with my girlfriend? I may not be using an iPhone in three years, so all of my messages on Whatsapp Messenger will be gone. I definitely won’t be using the same mobile phone, so all of my SMSs will be gone. My Gmail storage will have filled up, so I won’t have any of our emails any more. I doubt I’ll even still be using Facebook – there’s all of that communication gone.

All of this information that is so important and so relevant to me personally is just disappearing, and I won’t be able to track the relationships and friendships that I have had.

Personally, I am now backing up my computer daily, and copying and pasting communications from all the different formats into documents stored both on my hard drive and in the cloud. While it’s a start, it’s an absolutely horrific task, and it doesn’t completely work (I’m not going to be transcribing my SMSs into a document).

The abundance of technology is severely devaluing information. Do we go on ignoring this fact, and losing the details of our lives? Or do we do the hard work, and attempt to effectively store our communications? I know that I’ll be putting in the hard work – at least until the magicians in Silicon Valley come up with a better solution.

Thoughts on New Zealand’s School Decile Funding System

New Zealand’s school decile funding system has hit the news again, with Deputy Prime Minister and Finance Minister Bill English making public his opposition during a visit to Taita School outside of Wellington.

The idea of decile funding is sound. It is an attempt to take a proxy for the average level of socioeconomic need in a given school, and then to target additional school funding above the baseline to those schools with greatest need. It is, at its most fundamental, a recognition of the fact that the hardships of socioeconomic deprivation can affect the educational opportunities of students, and that providing equality of opportunity requires a concerted effort to counter the effects of deprivation.

Targeted Funding for Educational Achievement (TFEA) is the primary means by which this additional funding is provided. For 2016, for instance, a Decile 1A school (the lowest step on the decile measure, indicating severe socioeconomic need) will receive an additional $915 of funding per student above the baseline funding that all schools receive per student.

Again, I believe the decile system and TFEA are sound ideas to counter one of the most critical problems a country can face, and to ensure that everyone has an equal opportunity to make of their life what they wish. And yet, when one delves into how they work in practice, it becomes clear that a good idea does not necessarily solve the problem.

1. Only 45% of low socioeconomic status (SES) students attend a low decile school.

This is a critical failure of the use of a proxy to make an assumption about all students in a given school. Deciles are calculated using five socioeconomic measures of the geographic area in which a school is located. But within that area, there will clearly be disparities—some students will be severely deprived, while others may in fact not have great need.

Furthermore, the decile funding system is not tied to school zoning, meaning that students from outside the area used to calculate a school’s decile may attend a low decile school.

And yet the decile system and TFEA treat all students within a school as if they were of the same decile. The statistics show that the failure of this proxy is stark: over half of the students in a school receiving maximum TFEA do not in fact have the lowest level of socioeconomic need. It also means that the other 55% of severely deprived students attend schools receiving less than maximum TFEA.

Deciles target schools as a whole, but students have their own lives and their own stories. Proxies are necessary tools of policy; but the decile proxy is one that is not working.

2. A decile includes 10% of schools, but only 6.8% of students in New Zealand attend a decile 1 school.

Deciles count the number of schools, but schools do not all take the same number of students. Indeed, higher decile schools have higher rolls than low decile schools. This means that, as above, fewer students than intended receive TFEA.

To put this another way, Targeted Funding for Educational Achievement should in theory reach 90% of students (all students aside from those in decile 10 schools), and yet in practice it reaches 84%. This is another sign of the proxy’s failure to get resources where they need to be.
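The gap between school counts and student counts is simple arithmetic. A toy sketch (every roll figure below is invented for illustration, not a real Ministry of Education number) shows how a decile can contain 10% of schools yet well under 10% of students, and how student-weighted TFEA coverage falls below the theoretical 90%:

```python
# Hypothetical average rolls per decile (invented): low decile schools
# tend to have smaller rolls than high decile schools.
rolls = {1: 150, 2: 180, 3: 220, 4: 260, 5: 300,
         6: 340, 7: 380, 8: 420, 9: 460, 10: 500}

schools_per_decile = 10  # by construction, each decile has 10% of schools
students = {d: r * schools_per_decile for d, r in rolls.items()}
total = sum(students.values())

# Decile 1 has 10% of schools, but a smaller share of students.
decile1_share = students[1] / total
print(f"Decile 1: 10% of schools, {decile1_share:.1%} of students")

# TFEA reaches deciles 1-9: 90% of schools, but fewer than 90% of students.
tfea_share = sum(v for d, v in students.items() if d <= 9) / total
print(f"TFEA: 90% of schools, {tfea_share:.1%} of students")
```

With these invented rolls, decile 1 holds about 4.7% of students and TFEA reaches about 84% of them – the same shape as the real figures above.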

Those are two data points that, to my mind, are all that is necessary to show why deciles aren’t working in practice. And yet, of course, part of the bigger debate over deciles has been the stigmatisation of students because of the decile of the school they attend. It is a sad irony: decile funding doesn’t target individual students, and so much of this funding does not reach the students it’s meant for; and yet the stigma of a decile is very much attached to individual students. This stigmatisation can be as harmful as socioeconomic need itself.

The decile system is at once too transparent in the message it sends about students’ backgrounds and too opaque to work correctly. There are better proxies that could be used, especially with the kinds of data collection possible today. But whatever system is designed, it needs to ensure accurate targeting of funding, and it needs to do so without attaching any stigma to schools or students. Opacity is not necessarily a bad thing in this context; nor would be eschewing labels entirely and instead simply directing actual resources (especially the best teachers) to the schools with the highest levels of need.

This is one of the most important problems, and the decile system has been a serious attempt to solve it. But it’s time to try something new.

The Future of Social Networks

Note: I wrote this article in 2011, looking at how social networks could more accurately mimic real life societies. It ended up being the single most-read and most-commented-on piece on my blog. I was sixteen at the time, so excuse the writing. Interesting to see both how the numbers have changed since 2011 (600 million users! One and a half years!), as well as how Facebook has and has not moved closer to the vision I outlined.


So Facebook has 600 million users. Many people are saying that Facebook will now be here for ever, and the entire planet will eventually be on Facebook. The same people are saying it will grow to be the biggest company in history, and that it’ll make a killing for investors. I disagree. This article explains why I disagree, and discusses what social networks should look like to succeed.

Social networks are still in their early days. I don’t think they’ve really matured in any way, because they are still built on false assumptions made by the first few mainstream social networks. The system of “friending” is completely broken, and yet many people don’t realize it because they never stop to ask why it is that way.

Facebook says that all my friends and contacts are of equal importance to me. They know this isn’t true, but there is no way for me to distinguish between friends I am truly close with and contacts I met at a conference and felt obliged to accept on Facebook. In real life, we rank our connections by how important they are to us and how close we are with them. But on Facebook, this system has gone out the window because that functionality is not built into the social network.

But there is more about Facebook that is broken. Facebook is a “one-size-fits-all” social network. In other words, it assumes that everyone will find use in Facebook as long as they are on it with their friends. It believes that the more users it has, the more likely it is that people will keep joining. But this view goes against how societies actually work.

We live in societies in real life because we surround ourselves with people who share similar values, beliefs, and interests. Sure, the fact that I support one political party over another says that I have slightly different values from the person next to me, but fundamentally our values and beliefs are very similar. And living in a society allows me to know that anybody I meet will have fundamentally the same mindset as me. People who share similar religions live in the same societies, because they understand each other. This means that I can meet new people, and be social with a group outside of my existing close friends, with the knowledge that anybody I meet will be essentially similar to me.

Think about the term social network for a moment. When we hear it, we think of online social networks, like Facebook, with a system of “friending” and where we only communicate with our existing contacts. But social network is a broad term. Actually, it kind of describes how we relate to our contacts in real life. We have our own social network in real life, and you know what? It works. It’s called our society, and it’s been around for centuries, if not millennia.

My question is: why aren’t online social networks built like physical societies?

Imagine this model as three circles, one inside the other. The inner circle holds your core group of friends and family – you share everything with them. There may only be 25 people in there, but these are the people you would call to tell something important that has just happened. They mean a lot to you. You’ll connect with these people by “friending” them – i.e. mutual designation.

The next circle, which is quite a few times larger than the inner circle, is made up of your connections. These are the people who you’ve met at conferences, or know from school – you’re not close with them, but you’d talk to them if you saw them on the street. To connect with these people, you just have to specify them as a connection. It’s more like “following” them, only they will see that you have specified them as a connection and they can specify you back.

The third and final circle is made up of outer society: people you don’t know, but whom you may meet someday. You cross paths with these people every day, but just haven’t yet taken the time to stop and talk to them. This final circle is huge – many, many times bigger than the previous two – and you have no direct link to its members unless you choose to make one.
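The three circles above can be sketched as a simple data model. This is only an illustration of the idea – every class, field, and method name here is my own invention, not any real social network’s API:

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """One person in the three-circle model (names invented for illustration)."""
    name: str
    friend_requests: set = field(default_factory=set)  # pending inner-circle requests
    friends: set = field(default_factory=set)          # inner circle: mutual designation
    connections: set = field(default_factory=set)      # middle circle: one-sided

    def request_friend(self, other: "Profile") -> None:
        # Inner circle: the pair become friends only once BOTH have asked.
        if other.name in self.friend_requests:
            self.friend_requests.discard(other.name)
            self.friends.add(other.name)
            other.friends.add(self.name)
        else:
            other.friend_requests.add(self.name)

    def connect(self, other: "Profile") -> None:
        # Middle circle: one-sided, like "following"; the other side can reciprocate.
        self.connections.add(other.name)

    def tier(self, other: "Profile") -> str:
        if other.name in self.friends:
            return "friend"
        if other.name in self.connections:
            return "connection"
        return "outer society"  # the default third circle

alice, bob, carol = Profile("Alice"), Profile("Bob"), Profile("Carol")
alice.request_friend(bob)   # one-sided so far: just a pending request
bob.request_friend(alice)   # now mutual, so they become friends
alice.connect(carol)        # one-sided connection

print(alice.tier(bob))      # friend
print(alice.tier(carol))    # connection
print(carol.tier(alice))    # outer society
```

The design choice mirrors the essay: friendship requires both sides to ask (mutual designation), a connection is one-sided like “following”, and anyone in neither set falls into outer society by default.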

What this model allows is for us to differentiate between true “friends” and mere “connections”. A clear distinction between the two lets you know exactly who will see what you share. It gives you the ability to share more with those you really care about without annoying your connections. And, likewise, it allows you to share things with connections that you wouldn’t share with your family. And what about “outer society”? Well, you can interact with them as much or as little as you want.

The beauty of this model is that it allows us to choose how we want to use our social network. If we want to use it like Facebook, we can do that – the choice is entirely up to us.

But there will not be just one social network that looks like this. There will be tens, if not hundreds of them – each with millions of users. The social network that you are a part of will be a representation of who you are as a person. It will signify your values, beliefs, and interests.

When will this shift in model of social networks occur? I believe it will start in a year and a half, and reach the mainstream in about three years from now. That’s time for these new social networks to be built and perfected.

In any case, the battle of the social networks is far from over. Facebook hasn’t won, and there are plenty of genius programmers at colleges around the world. Good luck.