I am a big fan of the expression "Race against the machines". I really do think that is where the job market is these days. However, we are still in the early part of the exponential explosion, so many people don't see this as a race against the machines. Instead, it looks more like a race against other humans. That isn't simply perception; it is also largely reality. People can race against the machines for a while, but that isn't a race you can really win. The machines improve exponentially, and by the time you realize you need to start running, there isn't much time left before you are completely left in the dust. So humans are racing one another for the positions that machines can't yet fill, and racing toward positions that are still machine-free when their existing knowledge and skills fall behind in another area.
One of the other things that I have been watching a fair bit recently is the free availability of educational resources online. Just a few years ago, there really weren't all that many options when it came to learning most topics. You could buy a book or spend a lot of money finding someone to teach you through traditional educational approaches. Some resources were available online, but they were expensive. Just between Khan Academy and the MOOC (Massive Open Online Course) sites (Coursera, Udacity, and edX), there is now a wealth of educational material available at a variety of levels.
So what does this have to do with free time? Well, if you spend all of your free time on activities that don't lead to self-improvement, you are losing the race against both other humans and the machines. I feel like US society has a mentality that chases entertainment and leisure. There is no doubt that people need both entertainment and leisure to maintain good mental balance, but I worry that most of the entertainment and leisure people strive for does nothing to improve them, and that we have made entertainment the ultimate goal of our life quests. If you want any chance of not losing the race, you need to strive to improve yourself. More broadly, I think we need to work on being a society that seeks self-improvement, both physical and mental, instead of one that seeks entertainment. I mentioned some mental self-improvement options above. On the physical side, Fitocracy is one of many tools you can use to help motivate you to exercise; in the end, the goal is to turn self-improvement into something more akin to entertainment.
So who should be doing this? I would argue everyone. In my first blog post looking at the course I am taking at Coursera, I argued that MOOCs are great for continuing education for educators and many others. Students who want to go into more depth in a field, or who find that they lack background in some area, should definitely consider turning to MOOCs or resources like Khan Academy.
These example groups skirt around the reality of the modern US: there is a large population of people who are currently losing the race and who really need to focus on self-improvement. That is the population of the unemployed and underemployed. As it stands, the labor force in the US is below 50% of the total population and dropping. That puts it back in the range it was in during the 1980s, when many women still hadn't moved into the workforce.
In most segments of the economy, this means there is very stiff competition for the positions that are available. That in turn means employers can be selective and take only the top people. If you aren't one of the top people right now, there is only one way to get there, and that is through self-improvement. In the past this might have been some type of cruel catch-22, because self-improvement always came at a monetary cost, whether it was tuition for school or just the price of the books you bought. Today, those barriers are gone. Anyone can sign up for a MOOC for free and listen to lectures from some of the world's greatest minds. If the material starts going over your head at some point during the semester, that's fine; you aren't contractually bound to go through the whole course, and you lose nothing if you stop at the point where it becomes too difficult. Now the only cost is the time you put into doing these things.
Unfortunately, what I think is happening more often in our society is that instead of going for the challenge of self-improvement, people go for the mindless joy of Facebook games (and other similar substitutes). I can understand the temptation. I got sucked into a mindless FB game once, and I have to admit that I am occasionally tempted to do such things when I feel overwhelmed. (The only reason I don't is that playing them wouldn't make me any less overwhelmed. In fact, while I was playing, odds are good that more stuff would get piled on top of me.) The thing is that FB games, pretty much uniformly, do not improve you or get you ahead in the race. As a result, the time a person spends playing them causes that person to fall further behind those he/she is racing against.
Now let's be honest: I do believe that automation is causing technological unemployment, and in the long term the nature of jobs and their role in our society needs to change fundamentally. The cynical side of me says that in the future we might actually want a large fraction of the population playing mindless FB games so that they stay happy and don't riot. However, we aren't to that point yet, and we still live in a world where people need to work if they want to live a decent life. As such, everyone should be doing as much as they can to improve themselves and move ahead in the race. Everyone should take advantage of the new opportunities for education and other forms of self-improvement. They should do so first with the goal of making their own lives better. People I know who are in the position of hiring others love seeing self-motivation and self-improvement. They will ask you in an interview what you have been doing with your down time, and if the answer isn't something they see as useful, it is a major black mark on your application. However, I think there is a second reason.
When I look forward toward a world where most people don't need to work for things to run smoothly, I see a world where we need people to be motivated by different things. If everyone seeks mindless entertainment and leisure, I don't think society will work well. It either falls apart completely, or we get a Wall-E type of scenario. However, if we can change our societal norms so that we become a society that strives for self-improvement, I think a world where we don't need jobs works very well. People in such a society will continue to work to educate themselves. They will put effort into creative endeavors. Inevitably, they will still enjoy the occasional period of rest or activities that are purely for entertainment and leisure, but we need those activities to be what people do occasionally to recharge and to gain motivation for the next step in their life path, not the ultimate goal that people work toward.
Tuesday, September 11, 2012
Proposal for a Technology Capacity in the Trinity Curriculum
This is a follow-up to my previous blog post about the curricular proposal that recently came out at Trinity. On Friday, 9/7/2012, there was a meeting of the faculty to discuss the proposal. It came out fairly quickly that the committee who designed the proposal acknowledged that not addressing technology was a hole in the current proposal and that it needed to be addressed. They also mentioned an addendum for a "Capacity" in technology that had been given to them shortly after the first time they presented the proposal to another group of faculty (the University Curriculum Council), but they hadn't had sufficient time to integrate it, or even to decide whether they liked it. In this blog post I want to lay out my idea for how technology should be incorporated, which I believe mirrors the existing addendum closely. I will provide my reasoning, more details than are in that addendum, and a consideration of feasibility based on faculty resources.
Why?
I want to start off with a quick recap of why I think that building real technological proficiency in Trinity graduates is essential today and will only grow in importance moving forward.
- The world is run by computers and technology. Digital processors running software control your financial transactions, your utilities, your car, and pretty much everything else you use on a daily basis.
- This is an area where primary and secondary schooling utterly fails. For example, students in Texas are required to take four years each of English, Math, Science, and Social Studies. Foreign language is a two-year requirement, and the higher-level graduation plan requires three years. One can debate how effective these efforts are, but there is at least some foundation in all of these areas. On the other hand, Texas students are required to take NOTHING related to understanding and utilizing technology. (In fact, the way it is included in the curriculum discourages the types of students who would attend Trinity from taking it.) In a way, this makes technology education at the college level something of a remedial subject. However, it is clearly important to the future world, and we have to make sure our students aren't completely ignorant of it at graduation.
- Computers are tools that can solve certain problems that humans are very poor at. This isn't really about technology. This is about problem solving, and about having enough knowledge to identify, and hopefully use, the proper tool for solving various problems. With the growth in data sets, especially publicly available data sets, finding the correct answers to more and more problems is becoming something that computers are really good at and humans aren't.
So what should our technology requirement be asking students to do? You can probably tell from what I wrote above that my real goal is to get students to a level of proficiency where they can use a computer to solve problems, in fields relevant to them, that humans cannot easily solve by hand. Most of these problems involve numerics or data processing at a scale that simply puts them outside the reach of unaided humans, but they are things a computer can finish almost instantaneously if given the right instructions.
I think that by aiming for this objective, these courses will implicitly give students another thing that I feel is absolutely essential: a sufficient comfort level in working with technology. Comfort in working with technology is essential for so many aspects of modern life, and as computing power becomes more ubiquitous, that is only going to grow. However, I don't think this is what courses should focus on. It should instead be something that falls out naturally when we force students to use technology to solve other problems.
For me, the ideal way to do this involves programming. It doesn't have to be serious programming, but it needs to involve putting together logic using the basic structures that have been part of the programmer's toolkit for decades. I would argue that learning how to program changes the way a person views every problem they encounter in a way that is far more fundamental than learning a foreign language. When you program, you have to really break a problem down and figure out how to express it in terms that a computer can understand. So programming is, in many ways, a translation problem. You translate from normal human terms into the more formal terms and syntax of a programming language.
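To make that translation concrete, here is a minimal sketch in Python (the temperature values are invented for illustration) of turning the plain-English question "how many days last month were above freezing, and what was their average high?" into the handful of constructs that nearly every programming language shares:

```python
# Hypothetical daily high temperatures (Celsius) for one month.
highs = [2.5, -1.0, 0.5, 3.2, -4.1, 6.0, 1.1, -0.3, 5.5, 2.0]

above_freezing = []          # accumulate the values that match our condition
for temp in highs:           # iteration: look at every measurement
    if temp > 0:             # selection: keep only the days above freezing
        above_freezing.append(temp)

count = len(above_freezing)
average = sum(above_freezing) / count if count > 0 else 0.0
print(f"{count} days above freezing, average high of {average:.1f} C")
```

Nothing here is specific to Python; the same loop-plus-condition structure is what a student would write in any other language or scripting environment.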
While I think the programming part is critical, the way in which it is done is far less important to me and should be selected to fit the problem area. At the faculty meeting to discuss this, someone made a negative comment about courses teaching Excel. If all a course taught was basic Excel, I would agree. However, there is a lot to Excel that goes beyond the basics. Since the goal is to focus on using technology to solve problems, and the problems should be of sufficient complexity that basic Excel won't do it, I would be perfectly happy with a course that uses Excel and has students write VBA to implement more complex logic. Indeed, if the data sets associated with that course/topic tend to be tables of data, they probably come in either Excel or CSV format anyway, and then Excel isn't just a suitable choice, it is probably the ideal choice. (Other spreadsheets would work too. For example, the spreadsheet in Google Docs also has a scripting environment.)
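As a sketch of what "more complex logic" might look like, here is the same kind of exercise written in Python against a CSV export (the file name and column names are hypothetical); a VBA macro over the same table would follow the same logic:

```python
import csv

# Hypothetical course data set: one row per survey response,
# with columns assumed to be "state" and "income".
with open("survey_responses.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# A question that a single built-in spreadsheet formula handles poorly:
# the median income within each state.
incomes_by_state = {}
for row in rows:
    incomes_by_state.setdefault(row["state"], []).append(float(row["income"]))

for state, incomes in sorted(incomes_by_state.items()):
    incomes.sort()
    n = len(incomes)
    mid = n // 2
    median = incomes[mid] if n % 2 else (incomes[mid - 1] + incomes[mid]) / 2
    print(f"{state}: median income {median:,.2f} across {n} responses")
```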
The reality is that tools, whether Excel or something else, change over time. That is part of the nature of technology, and it is also why courses should not focus just on tools. If a student takes a course in his/her first year and that course focuses only on tool usage, it is possible that tool won't even be available or supported by graduation. However, whatever tools end up being used to solve those problems will inevitably draw on the same basic knowledge of programming/scripting. So this skill/knowledge translates well across pretty much all tools, because programming has shared elements across all languages. In a sense, there are certain constructs used to describe algorithms, just as things like past and present tense exist across all natural languages. By focusing on problem solving and pushing students into more challenging problems that require going beyond basic tool usage, we get to the logic elements that persist across time even as tools and technology change under them.
How?
So how do we make this happen in the curriculum? For me, the main point is that the majority of students should do this in a course in a department other than Computer Science. The key is that the computation should have purpose and context. There should be problems associated with a particular subject or line of study. When students take a course in CSCI at Trinity, we can throw some problems at them and give them some context, but everyone in the room has different interests, and in the Computer Science department we are often interested in the nature of computation itself more than in the problems it can be used to solve. (This is much like the difference between pure and applied mathematics. Almost no one outside of mathematics cares about pure math until some application is found for it that helps them solve a problem.)
So these would be courses taught in other departments, and to get approval for satisfying this capacity, they would have to demonstrate how solving problems through technology fits into the syllabus. Some such courses certainly exist. CSCI courses would obviously qualify, but I think there are quite a few others around campus, in departments like Communications and Urban Studies as well as some upper-level STEM courses, which already do this without modification. More will be needed though. I think many of these courses could be created fairly easily from existing courses by adding assignments/projects that wouldn't have been possible without the technology component. Existing courses that already use technology for problem solving could work within their current hour allotments. For courses that need this added on, I would not want to see the computing elements cut into their normal content. Instead, I would rather see an extra hour of credit added for that purpose. That extra hour would include the time where students learn how to use the technology for problem solving, as well as where they find whatever information (such as data sets) they need in the process. So a lot of the courses that satisfy this would go up to being 4-hour courses under the current system. It might also be possible to have the technology be an add-on lab that only some students take. That might not work well in many cases, but allowing it as an option would be very helpful in those situations where it does.
The courses where technology-based problems are added on are where resources really come into play with this proposal. If Trinity can't actually enact and sustain a proposal, it doesn't matter whether or not it is any good. Clearly, the courses that already satisfy the requirement need no new resources. However, those will likely be a small fraction of the total. Most of the seats for this requirement would need to come from courses that are augmented with computation, and the faculty teaching those might well need some assistance to do that.
How many seats are needed? I personally think that students would benefit most from having to take 2-3 courses that fulfill this requirement. The hope is that students would see slightly different approaches, which helps them abstract the ideas and see how to apply them more broadly. For every required course, we need ~650 seats/year. Sections typically hold 20-30 students, so it is reasonable to say that we need about 30 sections each year for every course that is required. That means anywhere from 15-45 sections/semester to have 1-3 of these in the graduation requirements.
Is this doable? I think so and I will go into detail below. First though, I can already see some people objecting that there is no reason there should be 3 computing/technology courses required. However, I would remind anyone making that objection that these aren't computing/technology courses. These are courses in subjects from around the campus which include a significant assignment or project which highlights using technology to solve a problem in that field. There are over 20 departments on campus so even requiring three of these courses for graduation only implies that each department offer ~2 such courses per semester.
(If my department chair sees this blog he should probably stop reading at this point.)
Where things get harder when it comes to resources is that not all faculty will feel comfortable putting this type of content into their classes. Even faculty who can find great data sets online, and who want their students to process/mine those data sets for interesting information, might not feel comfortable with the responsibility of giving the students the ability to do that type of technology-based problem solving. I don't think they should have to do it alone. I can't volunteer CLT to help with this because they have other tasks. However, the CS department and the instructors housed in it who currently teach IT skills could likely provide some support.
Currently the CS department teaches ~3 sections of CSCI 1311 each semester and the IT Skills instructors in the department teach ~7 sections of 1300. That is 30 contact hours per semester currently devoted to the CC. Some of those sections would probably be kept untouched, but in a very real sense that is enough human time to assist with one hour of credit for up to 30 courses each semester. In addition, early efforts to do things like prepare video lectures that cover this material could make it possible to get students up to speed with the skills that they need to solve the problems in question with less direct involvement from faculty in that aspect of the course.
In summary, the reality of the modern world is that computers run everything, and students need some knowledge of how the software that runs those computers works. They also need to know how to make computers solve problems for them in situations where a computer can do it better than a human. This should be done in the context of topics the students are studying for other reasons, not just because we are twisting their arms to code. We have the resources to make this happen. It just takes a little will on the part of the faculty. The result will be much stronger students who are more ready to enter the world of 2022.
Wednesday, September 5, 2012
A Curriculum for 1992
(Update: The faculty met to discuss this on Friday afternoon, 9/7, and the committee said that they did feel this was a hole in the curriculum and they needed more time with the proposal to fix it. I will keep my fingers crossed. My next post will basically elaborate on my proposal and the staffing requirements for it.)
Trinity has been working on revising the curriculum for nearly a year now, and today a formal draft of that curriculum was sent out to the faculty. As you can tell from the title of this post, I am not impressed. I'm writing this post not only to express my ideas, and inevitably some frustration, but hopefully to motivate alumni to take a little action in regard to it. Keep reading for details.
Computers Are Just a Fad
At least that is the impression I get from the curriculum document. The charge for the revisions was to create a curriculum for the 21st century. However, the only mention of technology in the entire document comes in the FAQ at the end. Here is what they say:
2. If we are trying to educate students for the 21st century, why isn't technological and information literacy part of the capacities?

Answer for technological literacy: Our committee agrees that the ability to use, understand, and criticize technology is of tremendous importance. Technological advances flow into the classroom as they become relevant to educational content and delivery, and we are confident that Trinity faculty bring these technologies (as well as a thoughtful discussion about their strengths and limitations) into their courses.

Answer for information literacy: Information literacy is a hallmark of a Trinity education through the university's commitment to the QEP. It was felt that most, if not all, of our classes support and reinforce information literacy.

At least they see this as a weakness of their proposal, and they acknowledge the importance of technology. However, they seem to think that faculty will somehow magically start incorporating this into the classroom and that students are certain to take courses that use the needed technology. The reality is that college faculty are, to a large extent, some of the least technologically savvy people on the planet. What is more, I frequently see students who work to avoid technology in the same way that they avoid math and science. That is a bad decision on their part, and most will realize it later in life. Part of why students pay to go to college is so that other people can give them direction and help them avoid making those bad decisions. In my opinion, as this curriculum currently stands, it fails miserably in this area.
Globalization: The Game Changer of the Last Several Decades
So what does this curriculum do instead? There are changes in it. One of the big ones is a push to address globalization. This includes a "capacity" with courses on "Global Awareness", "Understanding Diversity", and "Foreign Language". This is on top of the standard elements where you have to be able to read, write, and speak as well as a smattering of courses from humanities, social sciences, natural sciences, and math. The new part is dealing with being an "engaged citizen", which seems to be largely motivated by a desire to have Trinity students prepared for globalization.
In my opinion, globalization is yesterday's news. I made the title of this post refer to 1992 because, honestly, a really forward-looking curriculum would have included globalization back then. Now this is just a knee-jerk reaction to a boat that was missed two decades ago. Globalization was perhaps the biggest influence on our economy and the general evolution of the world over the few decades leading up to 2010. However, it isn't what is going to shape the future. Yes, global exchange of information is going to continue to be important, but production of goods is on the verge of heading back the other way. New approaches to digital manufacturing, including 3-D printing and increased automation, are making it possible to put the production of goods back close to the point of consumption. After all, why should we ship materials to China, have them assembled there, and ship them back if they can be assembled here? For the past few decades the answer was that assembling them here cost too much. However, today even Chinese companies like Foxconn are planning to replace their human workers with robots. Those robots aren't any cheaper to run in China than they are here, and energy costs for transportation are only going up. So at a certain point (which I expect is less than 10 years from now) it makes sense to put the robots close to the end consumer and make things as close as possible to where they will end up.
In addition, technology is significantly reducing the need to actually speak a foreign language to have a dialog with someone who doesn't speak yours. Today Google Translate can allow me to have a reasonably fluid conversation with someone who speaks a different language, and the quality of translation and speech understanding is improving by leaps and bounds. If you have seen comparisons between Google Now and Siri, you can see what one extra year of development means in this space. I fully expect that by 2022 I will be able to speak to someone in almost any language in a manner that is very close to natural without knowing that language. This isn't to say that there aren't cognitive benefits to learning a foreign natural language. It is just to say that interpersonal communication is going to cease to be one of those benefits.
If Not Globalization, Then What?
So what do I think is the game changer of the coming decades? What should our new curriculum aim for? The paragraphs above should make this fairly clear. Globalization is going to take a back seat to the oncoming surge of digital technologies enabled by machine-learning-based AI and automation. It is impossible to predict exactly what will be relevant, but based on what is already out there you can feel pretty confident that in 2022 most students will have cars that drive themselves and many of them will have robots at home that cook and clean. (Sound like sci-fi? Then you need to follow me on Google+ or at least go search for videos of those things on YouTube, because they are feasible today and will be cheap enough to be widespread in a decade.)
There are other things that are becoming increasingly significant as well. The buzzword "big data" is everywhere for a reason. In addition, the rollout of IPv6 wasn't much hyped, but if you look in the right places you can hear the rumblings of the beginning of the internet of things. When your shirt has an IP address and is constantly sending information about your temperature and heart rate into the cloud for analysis, you will begin to understand what these things are. They are primed to change the way we live in dramatic ways.
What does this mean for the curriculum? My take is that if a graduate of 2022 looks at a computer and sees a magic black box with pretty pictures on it, that graduate has already lost at life. They are a powerless consumer with no ability to produce in the markets that will define their time. If we let them become that, we have failed the trust that they put in us when they enrolled in our school.
My Proposal
So what do we do about this? What do I think the curriculum document should have included? First, let me tell you what it should not have included. It should not require that every student take a CS course specifically aimed at teaching students to program. That would be a nightmare for me on many different levels, and it wouldn't really benefit the students. Some students need to know how to really code, and those students can learn programming language fundamentals without associated context. The vast majority of students, though, need to learn how to use computers to solve problems with at least slightly more competence than just running pre-written software.
Increasingly, data is what drives the world. Humans are horrible at manipulating even reasonable amounts of data; computers are great at it. Graduates of 2022 should have seen the data associated with courses in a number of different departments, and they should have had to do something beyond just plugging it into existing software to dig for meaning or answer questions based on that data. They need to have some experience using a computer and associated technologies to solve problems. That is what really matters. They need the skills to turn the computer into a tool they can use to solve problems that are beyond what they can do alone.
I believe the best way to do this is to require that students take a few courses that have them do computer-based problem solving. The ideal situation would be for courses that normally count for 3 hours, in departments all across the University, to add an extra hour of credit and a computational problem-solving component. For example, a course in Political Science could ask students to analyze census data or data from the last presidential election. Have the students answer questions that aren't simple queries in Excel. That way they might learn how to write some VBA and work out a little logic to solve the problems. Or maybe the questions you want to answer are well suited to some reasonable SQL queries. Sometimes the right approach might be writing scripts in Python, Perl, or Scala. The details don't matter to me; they should be chosen to fit the data set and the questions being asked about it. What matters is that students learn how to make technology do what they want instead of acting as passive consumers of software that someone else has written.
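To give one hedged illustration of the kind of assignment I have in mind, here is a small sketch that uses Python's built-in sqlite3 module to ask an SQL-style question of a hypothetical table of county-level election results (the database file, table, and column names are all made up for the example):

```python
import sqlite3

# Hypothetical database: a table "county_results" with columns
# county, state, votes_candidate_a, votes_candidate_b.
conn = sqlite3.connect("election_results.db")

# A question that is not a simple spreadsheet query: in which states did
# candidate A carry a majority of the counties while still losing the
# statewide popular vote?
query = """
SELECT state,
       SUM(votes_candidate_a) AS total_a,
       SUM(votes_candidate_b) AS total_b,
       SUM(CASE WHEN votes_candidate_a > votes_candidate_b THEN 1 ELSE 0 END) AS counties_won,
       COUNT(*) AS counties
FROM county_results
GROUP BY state
HAVING SUM(CASE WHEN votes_candidate_a > votes_candidate_b THEN 1 ELSE 0 END) > COUNT(*) / 2.0
   AND SUM(votes_candidate_a) < SUM(votes_candidate_b)
"""

for state, total_a, total_b, counties_won, counties in conn.execute(query):
    print(f"{state}: carried {counties_won} of {counties} counties "
          f"but trailed {total_a} to {total_b} statewide")

conn.close()
```

The point is not the particular tool; the same question could be answered with a VBA macro or a script over a CSV file, as long as the students have to build the logic themselves.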
I've always liked the expression that if your only tool is a hammer, every problem looks like a nail. All too often, I see people do things the hard way because they don't know there is an easy way. Even worse, I see people who can't even conceive of certain questions because the tools they know don't allow them to answer those questions. If our graduates fall into either of those categories, we have failed them. I don't want to see that happen. A major curricular review is the time to make sure we do things right. Unfortunately, I don't think the current proposal is doing that.
Call to Action
So I want to close with a little call to action for any Trinity alumni out there who are reading this. If having the ability to control technology and make it do what you want to solve problems has benefited you in life, let your old faculty know. Take a minute to tell them how skills like programming have benefited you in your life and what things you wouldn't be able to do without those skills. You might even just pass on a link to this if you don't want to write much yourself. In addition, forward this to other alumni so that they too might help the faculty at Trinity to see that computers are not a fad that is going away and that being able to bend and manipulate technology to your benefit to solve problems really is a valuable skill that everyone needs to have.