Sunday, November 25, 2012

The Fallacy of "Deep" Courses and Workloads


This post returns to the proposal being considered at Trinity to change the student course load to 4 courses per semester, each worth 4 credits (4-4), along with the associated change in teaching load to five courses per year, three one semester and two the next (3-2). I am writing this blog post because I am finally getting fed up with statements that I keep hearing about how some courses simply require more time than others, especially time spent outside of class. There is also an associated aspect of this, where the same people who make these statements insinuate that this extra time is needed for their courses because the courses are too "deep" for a normal 3 credits.

Why I am Upset

Here is the problem: you don't hear these things coming from STEM faculty. You know, the people teaching the science and math courses that most students consider to be the hardest courses on campus. The majority of STEM faculty seem to be quite happy with how things are. The push for change seems to be coming largely from the humanities and social sciences.

I guess what really offends me is that these statements imply that because I am happy with my course counting for 3 credits, my course must somehow be easier and less deep. I'm sorry, but I have had many students tell me that the courses I teach are the hardest ones they take during their entire time at Trinity. My courses also keep students very busy outside of the classroom. My guess is that most of the students in my courses will spend more time on my class than on any of the other courses they are taking. In fact, when students can't do that they tend to wind up withdrawing from my course.

My courses aren't time consuming because they are full of busy work either. They are time consuming because students have to wrap their heads around completely new ways of thinking, and they have to learn to break down problems to levels they have never done before. Then they are forced to apply those capabilities and they are forced to make things work. My courses aren't fluff. They are rigorous and challenging and it is really getting on my nerves how so many faculty seem to be saying that their courses are harder than mine and hence need to count for more credits.

This doesn't just apply to my courses either. Anyone who tells you that courses like E&M, Quantum Mechanics, or Complex Analysis don't require deep thinking or much time spent outside of class clearly has no idea what they are talking about. They have obviously never spent time trying to picture a vector field flowing through a Gauss pillbox.

If people really want data on which departments have the more challenging courses, and which ones require more time and effort both inside and outside of class, there is a publicly available data set for that. It lists ease as one of its criteria. If you scan through it for faculty with an ease rating of 2.5 or less, you will notice that STEM faculty are extremely overrepresented. (So is the English department.)

(Here is a coding assignment for any of my students reading this. Write a program to scrape that data. I'm most interested in department and ease for this topic, though it would be good to have names as well so that people who aren't current faculty can be removed.)
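Once the scraping is done, the analysis step is simple. Here is a minimal sketch of that step in Python; it assumes the scraper has already produced (name, department, ease) records, and the function name, field layout, and stand-in roster are all my own illustrative choices, not part of any existing tool.

```python
# Count low-ease faculty per department, keeping only current faculty.
# The record format and the roster set are hypothetical stand-ins for
# whatever the scraper and a current-faculty list actually provide.
from collections import defaultdict

def departments_by_low_ease(records, roster, threshold=2.5):
    """Count faculty per department with an ease rating at or below threshold."""
    counts = defaultdict(int)
    for name, dept, ease in records:
        if name in roster and ease <= threshold:
            counts[dept] += 1
    return dict(counts)

# Illustrative data only.
records = [
    ("A. Adams", "CS", 2.1),
    ("B. Brown", "ENGL", 2.4),
    ("C. Clark", "HIST", 3.6),
    ("D. Davis", "CS", 2.5),
    ("E. Evans", "CS", 2.2),  # not on the current roster, so excluded
]
roster = {"A. Adams", "B. Brown", "C. Clark", "D. Davis"}
print(departments_by_low_ease(records, roster))  # {'CS': 2, 'ENGL': 1}
```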

Enforcing Work

My gut feeling is that faculty who think they need students to take fewer courses and want students to have more time outside of class really just need to find better ways of enforcing work done outside of class. The idea that students are booked solid with academics when taking five 3-credit courses is absurd. Trinity students spend lots of time doing many things that aren't academic. Give them more time outside of class and they will use it for a variety of non-academic activities unless you can enforce that they use it doing the work for your class.

The notion that STEM courses somehow require less time outside of class is absurd. The reality is that STEM courses typically do a much better job of enforcing that students actually do what they are supposed to do outside of class. I can give my students assignments and exercises, and if they don't take the time to learn the material, they will be completely incapable of doing those assignments. No, my students don't read everything I assign. However, they will read enough to be able to write the programs I assign. (Honestly, many students would probably spend a little less time on my class if they would do the reading up front instead of wasting hours trying to code before they understand what they are doing.) I know this is more challenging for many non-STEM courses, but that doesn't mean it is impossible.

The bottom line, as I see it, is that changing the number of hours/credits for a course doesn't make students do more work, and making new policies based on the idea that it will is a great way to reduce the quality of a Trinity education. If faculty want students to do more work for their course, they need to be inventive and creative and find ways to enforce that students do the required work. That is the only way to make change happen.


I'd love to hear back from anyone who reads this and is willing to say which courses they had in college that challenged them the most or kept them busy the most. I'd also love to hear why. This is especially true for Trinity students.

(Update: I would like to thank Laura Gibbs for pointing out the National Survey of Student Engagement in the Google+ discussion of this post. Ideally students should be spending 2 hours outside of class for every hour inside of class; with a typical 15-hour load, that works out to roughly 30 hours of preparation per week. The survey shows that students actually spend between 12 and 18 hours total prepping for class. In other words, they aren't even close to the 2-hour mark. If faculty want students to work harder, the reality is that most students have the time in their schedules. They just need to be forced to take the time.)

Aside: A Useful Technique

I'll close with a technique that I learned in grad school from Amer Diwan, who taught a course on program analysis. This course was all about reading journal articles. I have used this approach to good effect in similar courses at Trinity. Make students show up to class with written questions on the reading. If you don't want to do questions, have them write a short paragraph instead. Base the in-class discussion on the questions students hand to you when they first walk in. Have a portion of the semester grade come from the quality of what the students write for this. Call students out when what they provide sucks and shows that they didn't really put in the effort.

Saturday, October 20, 2012

Problems with the 4-4 Student Load

As I have mentioned previously, Trinity is considering changing from the current student load of ~5 courses each semester to a load of ~4 courses each semester. This alternate configuration is called a 4-4 student load, as each student normally takes 4 courses, each of which is 4 hours of credit. A related proposal is to reduce the teaching load from the current 3-3, in which each faculty member teaches three courses each semester, to a 3-2 teaching load in which faculty alternate between three and two courses. In general I am opposed to both of these changes, but I have to admit that my opposition is based largely on thought experiments and imagined consequences instead of empirical data.

This weekend I got the chance to talk to someone who teaches at Southwestern University, which made the change from a 5-5 student load and 3-3 teaching load to 4-4 and 3-2 a few years ago. This faculty member therefore has direct experience with both systems. I wanted to record what we talked about and her perspective on the change here, because I felt she had some very good insights.

Lack of Student Flexibility
The #1 problem that she described was something I hadn't even thought of: a lack of student flexibility in scheduling. In a 4-4 student load situation, students really need to take 4 courses each and every semester. The reason is that it isn't feasible for most students to go up to 5 courses when each one is four hours, and if you have more than one or two semesters with only three courses, you won't graduate on time.

All faculty know that occasionally students get in over their heads or sign up for courses they really aren't prepared to take. Under a 4-4 scheme, these students really can't drop those courses without pushing back their graduation. Students who choose to register for only 3 courses originally, taking a light load, are even worse off if one course causes them trouble, because dropping to two courses can jeopardize their full-time enrollment status for the year. That leads to all types of financial difficulties for most students.

Under the category of lacking flexibility, Southwestern also runs into problems when it comes to transfer students and transfer credit. Given the challenges of enrolling students, transfers are potentially very important to many liberal arts schools. How do you count the 3-hour credits that most transfer students will come in with? Similarly, many Trinity students take summer courses away from Trinity, and the same is true for Southwestern. We can't give students 4 hours of credit for a 3-hour summer course taken elsewhere. So we might check off a requirement for them, but they run into problems when it comes to total hours. Here again you can have students who fail to graduate on time because they don't have the right number of hours. With a 4-4 configuration you simply lose the flexibility for students to go slightly above the normal requirements to offset deficiencies.

Too Few Courses
Closely related to the problem of student enrollment flexibility is the problem of course offering flexibility. The faculty member I talked to noted that her department (a STEM department) was forced to reduce their major to 10 courses. This reduces the number of electives that students take as part of the major and how many electives can be offered. Not only are there fewer teaching slots for electives, but students take fewer of them, so it is hard to get a critical mass of students to justify offering any particular one.

An odd side effect of having majors cut down to 10 courses was that some departments bent the rule by hiding requirements in prerequisites. In particular, she mentioned that the physics department, in order to get under the 10-course limit, doesn't explicitly list any math requirements. Instead, they make math courses prerequisites for certain physics courses, turning them into implicit requirements. I know that Trinity highly frowns on implicit requirements, and the University Curriculum Committee typically rejects any such proposal. However, some fields truly do need to include more courses, especially when outside requirements are counted.

Caps on majors, or just the limits on courses, could cause problems for things like theses as well. The CS department at Trinity has a 3-semester honors thesis track. There is no way we can do three semesters when students only take four courses each semester. This is definitely one of those situations where two four-hour courses are not even close to equivalent to three three-hour courses.

The last problem presented by reduced course flexibility in the change made at Southwestern is the inability to offer short seminars and the like on topics of interest. Their modified system does not nicely support the equivalent of one- and two-hour seminars or independent studies. This can make it much harder to support undergraduate student research.

Courses Didn't Increase in Difficulty
The primary argument for the 4-4 student load is that students are overburdened by having five courses each semester and that courses could be more rigorous if students only took four. I have always felt that this argument falls flat. Students spend a lot of time doing things outside of academics. If faculty members really want their students to dedicate more time to their course, they simply need to make the course harder and find ways to enforce that students really do the work. It might not be easy or even obvious how to do it, but that is what needs to be done. If faculty can't find ways to enforce students doing the work, moving to the 4-4 model isn't going to help.

Indeed, the Southwestern faculty member said that my fears match what has happened there. Few faculty have actually made their courses more rigorous. What is worse, because most of the courses went to 4 credits without going up to 4 contact hours, she feels that students are actually spending less time on academics. Why? Because students now have only 12 hours in class. So when they look at their schedules they see even more "free time," and they tend to book it with things like jobs, sports, or other extracurricular activities. Once they have done that, they truly don't have the time for extra rigor even if faculty members step up and make their courses more rigorous.

The reality is that you have to change the campus mentality toward courses and course work, and that matters more than how many courses students take or how many hours they meet. Apparently Southwestern is experiencing most of what I see as the worst possibilities of moving to a 4-4 and virtually none of the benefits. However, because they went down to a 3-2 teaching load, faculty see a benefit, so it will be nearly impossible to switch back.

Adjuncts and Conclusions
One last odd problem that Southwestern has run into is the challenge of hiring adjunct faculty. That can be a difficult process in many departments even when asking them to teach a three-credit course; asking them to teach a four-hour course makes it harder. And if adjuncts teach a course that only meets three hours but is supposed to carry a heavier workload, it is very unlikely that they will demand the desired level of effort.

The general conclusion from this faculty member was that she couldn't find anything good to say about the 4-4 student load at Southwestern. Only the negatives of the change have been manifest in the implementation. The same is almost true of the 3-2 teaching load with the minor exception that there is some small benefit to having the freedom of picking when the 2-course semester is done. However, in practice Trinity does not appear to be extremely strict about making certain every faculty member teaches three courses every semester so this is not really a practical benefit.

Friday, October 19, 2012

Automated Exercises for Code Tracing

One of my goals for my courses is to have students do a lot more exercises. I am already putting up videos to go along with the book material, but students need to apply what they watch. (There also needs to be a way to enforce that they watch/read things.) For writing code, this isn't all that complex beyond having a tool to run and check submissions, and I have been planning to write such a tool. There are other tools that do this as well, but I don't know of any that use Scala. I want to write something that is language independent so that I can use it across all of my classes.
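The language-independent part is less daunting than it sounds. One possible shape, sketched here in Python, is to run the student's program as a subprocess and compare its output to an expected answer; the function name, command lines, and file names below are all assumptions for illustration, not a description of any existing tool.

```python
# Run a student submission as a subprocess and compare trimmed stdout to
# the expected output. Only the command differs between languages, which
# is what makes this approach language independent.
import subprocess

def check_submission(command, stdin_text, expected_output):
    """Run `command`, feed it stdin_text, and compare trimmed stdout."""
    result = subprocess.run(
        command, input=stdin_text, capture_output=True, text=True, timeout=10
    )
    return result.stdout.strip() == expected_output.strip()

# Hypothetical usage; only the command changes per language:
#   check_submission(["scala", "Exercise1.scala"], "3 4\n", "7")
#   check_submission(["python3", "exercise1.py"], "3 4\n", "7")
```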

I also value having students trace code. Of course, if I just show students code and ask them what it evaluates to/prints when they are in their room, they can simply type it in and run it. That doesn't benefit them or force them to work on the skills I want them to develop. This post describes an idea I had regarding this. If anyone who writes CS educational tools happens to read this, feel free to take the idea and implement it yourself. Part of the motivation for this idea came from the stack trace display in DrRacket.

The basic idea is that you have students indicate where the flow of control goes and possibly have them enter values for variables along the way. The goal is to force them to actually demonstrate that they followed the code. That way, even if they do type it in and run it, they still have to think things through.

The figure below shows a mock-up of what the simplest interface might look like. The instructor would provide the code and the student has to click on each line in the order that control passes through them. Not only does this force the student to think through what is happening, it can be verified automatically so that students can be told if they have the proper trace without taking additional instructor time. That is essential if you actually want to use this to force students to do things in a real setting.
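The automatic check behind that interface can be sketched very simply. Assuming the tool records the line numbers the student clicked, in order, something like the following Python (names illustrative, not from any existing tool) is enough to say whether the trace is right and where it first goes wrong:

```python
# Compare a student's click sequence to the expected control-flow trace.
# Returns (ok, first_mismatch_index); the index is -1 when the trace is
# correct, so feedback can point at the exact step that went wrong.
def check_trace(expected, clicked):
    for i, (e, c) in enumerate(zip(expected, clicked)):
        if e != c:
            return False, i
    if len(expected) != len(clicked):
        # One sequence is a prefix of the other; the mismatch is where
        # the shorter one ends.
        return False, min(len(expected), len(clicked))
    return True, -1

# Expected trace for a hypothetical 3-iteration loop over lines 1-4.
expected = [1, 2, 3, 2, 3, 2, 3, 4]
print(check_trace(expected, [1, 2, 3, 2, 3, 2, 3, 4]))  # (True, -1)
print(check_trace(expected, [1, 2, 3, 4]))              # (False, 3)
```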

The figure above is a rather minimal example of tracing as it only follows the control flow. For some problems this might be sufficient for demonstrating that the student knows what is going on. However, for many problems it would be advantageous to have the student also track values attached to names/variables. The figure below shows a very ugly, first draft of what such an interface might look like. This is something that would need a lot more interface work, but the idea is to show the same control flow information, but also have tables that show the values of variables and how they have changed over time.
This would work for recursive functions too. As shown here, you have a table with a large column for each variable/name. You could also have separate tables for the different stack frames. I can also picture a more dynamic representation where there is only one row drawn per stack frame, and the program records the values in that row for each step in the computation. Then the student, or someone else reviewing the student's work, could select a step to see the values in each stack frame at that time.
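The check for this richer interface is still just a comparison against an instructor-provided trace, now carrying frame and variable information at each step. The event format below (frame id, line, variable values) is my own assumption for this mock-up, not an existing tool's API:

```python
# Each trace entry is (frame_id, line, vars_dict). The check passes only
# if the student's control flow AND recorded values both match exactly,
# which is what forces students to actually follow the values.
def check_full_trace(expected, submitted):
    return expected == submitted

# Hypothetical expected trace for factorial(2): frame 0 calls frame 1.
expected = [
    (0, 1, {"n": 2}),  # enter factorial(2)
    (0, 2, {"n": 2}),  # recursive call
    (1, 1, {"n": 1}),  # enter factorial(1)
    (1, 3, {"n": 1}),  # base case returns 1
    (0, 3, {"n": 2}),  # frame 0 resumes and returns 2
]
student = list(expected)
student[3] = (1, 3, {"n": 2})  # student recorded the wrong value of n
print(check_full_trace(expected, expected))  # True
print(check_full_trace(expected, student))   # False
```

A real tool would want to report which step differs, as in the simpler control-flow check, but equality is enough to auto-grade.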

Does anyone know of tools like this? Would anyone be interested in having something like this? Personally, I really like to get students to trace through code and I can't do nearly enough of it. I feel that a tool like this could be very beneficial in the CS1 setting and even in some parts of CS2 and Data Structures.

Thursday, October 11, 2012

Where I See Us Heading

Right now everyone seems to be debating politics and talking about why the candidate they dislike is so bad. A few people are also talking about why their candidate is any good. So I thought I would step to the side and write a few words about my view of what the near future holds, and why I think both major candidates, and indeed both major parties in the US, are pretty much worthless.

Everything centers on the economy, and both parties say they want to get people back to work. My take is that both will fail, and the efforts they put into it are not only futile but likely a waste of resources. The reason is that technology has made many jobs obsolete, and it is going to make many more jobs obsolete in the near future.

I know that the standard economist response to technological unemployment is simply to point to the Luddite fallacy and say that technology has only created more jobs all through history. I have another blog post in preparation where I will argue that this is one place where you can't just go off history. Pure logic can show you that the Luddite fallacy breaks down at some point. So the only question is, are we at that point?

I would also point out that I don't think there is anything fundamentally bad about there not being enough jobs. There is nothing about the human condition which I feel implies we have to work to have a good life, at least not the type of work we do for money in the modern world. If you enjoy gardening keep doing it. However, that doesn't mean that humans have to be employed in the manual labor of growing food when they don't feel like it. Similar logic applies to many other areas.

I write this post just after the September 2012 unemployment figures have come out. The U3 unemployment level for the US sits at 7.8%. This is actually a pretty significant drop from previous months. Here you can see a plot from Google's public data explorer showing this.
This seems like good news. However, the U3 doesn't count lots of people. The U6 is a much better measure, and it didn't budge in September 2012, holding steady at 14.7%.
Probably the best measure of what is happening with jobs is simply the size of the labor force. Both political parties are promising to "make more jobs." That is what has to happen if you want to really drop unemployment: more jobs must be created than are needed to balance the increase in population. Here is what Google's public data explorer can tell you about the number of people working in the US. This is a plot of the labor force. Note that it hasn't budged since 2008.
The population has been changing, though. The plot below of data from the US Census Bureau shows that the population simply keeps rising even as the labor force has completely flatlined.
In many ways the US is doing better than other parts of the developed world. I heard a report on NPR yesterday that Spain is up to 25% unemployment, with the rate over 50% for those in their 20s. I don't know if the "unemployment" they measure is closer to our U3 or U6, or if it is something completely different. However, those are pretty staggering numbers.

There are lots of reasons why things are this way today and technological unemployment is not the primary cause, IMO. However, it is growing in importance, and it is why I don't expect things to turn around. In fact, they will get worse. I think it will also break some standard economic ideas. For example, "trickle down" effects disappear if you stop paying money to people lower on the wage scale because you automate away all of their jobs. This line of reasoning could go into a whole different direction looking at things like median wages and quartiles vs. corporate profits, but that isn't the objective of this post.

While politicians and their fans can point fingers all they want about why the labor force isn't growing, none of them wants to tell the truth: they are impotent to change things. Businesses today don't really need many people. They especially don't need people with the skills that most people in this country have. Instead, they can get more done by investing in technology and augmenting the employees they have. Sometimes, that technology can even replace those people. I'll point the interested reader to two articles here. My Google+ feed is full of such things, but these two should be sufficient for this post.

First, I point to a recent article in the NY Times, "When Job-Creation Engines Stop at Just One". The rhetoric I hear from the GOP is that if we reduce the barriers to starting companies, those companies will make new jobs. I will point to this article and disagree. Most start-ups today are forming with a lot fewer employees than they used to. I see no reason to believe that removing barriers to forming businesses would change this. It simply costs less to use technology and contract workers than it does to hire full-time employees.

The second article is a bit older. It is called "9.2% Unemployment? Blame Microsoft.". It appeared on the Forbes site in the second half of 2011. Clearly the U3 unemployment rate has fallen since then, but as you can see from the charts above, that isn't because the labor force has grown. There is one particular quote from this article that I want to highlight.
So here’s the cold, hard truth about why you're unemployed:  most businesses don't need you any more. We can do just as much, if not more, without you.
This is the real reason why politicians can make all types of promises about employment and jobs, but none of their promises will come true. There is no policy change that can make people more useful to businesses overnight. The closest you can come to that would be to make people more useful per dollar spent. That would be policies like removing forced benefits and dropping minimum wage so that employers can pay human employees significantly less than they do today. This is something the GOP would actually be willing to do. Even that is a stop-gap measure that will prove ineffective in the longer term. Education is far more beneficial if you go beyond a few years, but even that has limited effectiveness in the long run. Education is something the Democrats will support. Neither is going to push for the changes I think we need to deal with what I'm going to describe below.

Before I start looking forward, I want to point out one more link. It is to a page at the Bureau of Labor Statistics called "Employment and wages for the largest and smallest occupations". I will refer back to the top table on this page in my points below. Note that the top 15 occupations in the US account for 27% of the labor force. If you were to wipe out most of these, even if not a single other job category were hit, our unemployment rate would look a lot more like Spain's. So now let's look at why I think that is going to happen, and how technology arriving at different points in the future is going to drive it.

1-5 years
What should probably scare you the most about that list of largest jobs is that the #1 item on the list is already under heavy assault. Unfortunately, the BLS list is for May 2011 and they haven't come out with a newer one. When it came out, there were still people employed at Borders. Some might argue that you can't point to a single employer because companies go under all the time. However, no one should argue that online sales aren't hitting retail sales in a huge way. The plot below from WolframAlpha only goes to 2009, but you can see the trend there, and that was well before a giant like Borders fell.
Retail isn't going to go away, but it is going to shrink. Amazon doesn't employ people to tell you about their new products or run the cash registers. Speaking of running cash registers, that is #2 on the list and that is in the same boat.

Of course, some people like to go handle things and try them on. One might argue that makes things like clothing retailers safe. The fact that they aren't is brought home by this article on virtual dressing rooms from CES 2012. Within 5 years I fully expect that many people will be ordering clothes that fit them perfectly well, in fact better than what they could have gotten in a store, using a system like this.

Another area that is going to undergo significant change in the next five years is health care. I don't know how much it will hit the ranks of RNs who come in at #5 on the employment list, but if personalized health leads to fewer in-person visits, that will lead to reduced hiring for people to see those patients. This is an area that is about ready to "take off" as described in "When will data-powered personalized health hit ‘escape velocity’?" This is only one piece in a big puzzle that is going to dramatically disrupt health care. Given how much money goes into health care, and how big a problem that is for the government and every other segment of business, disrupting health care in a way that brings down costs has to be a net benefit, even if it includes putting a good fraction of the 2% of the labor force out of their current positions.

Right above RNs is food prep. It is a common joke to say that students who finish college with a degree in something that doesn't translate to a job will be asking, "Do you want fries with that?" On a more practical side, fast food employs a significant number of people and it often is a safe type of job position or a last resort for many people, including those without the skills to go on to other jobs. Don't expect that to be available in 5 years based on what you can read in "Hamburgers, Coffee, Guitars, and Cars: A Report from Lemnos Labs". The founder of Momentum Machines didn't go to the PR training school where they tell you that you never say your product is going to put people out of work, no matter how true that statement is. As a result, we get this great quote.
“Our device isn't meant to make employees more efficient,” said co-founder Alexandros Vardakostas. “It's meant to completely obviate them.”
(I discovered this article through Martin Ford's blog post, "Fast Food Robotics - An Update".)

Since I mentioned Amazon above, and because they are so disruptive for the current workforce, I will close out the 1-5 year future with more about what they are doing right now. They bought Kiva systems so that they can completely automate their warehouses. That is opening the door for them to take the huge step into giving customers instant gratification. In less than 5 years, Amazon will have a warehouse in the nearest major metro to your house, and they will use heavy automation to get your orders to you fast. Next day will be the slow option. That will be just one more reason that retail will be less needed, as will retail employees.

5-10 years
Moving out to the 5-10 year range, robots beyond Kiva systems are going to start making an impact. Janitors are smack in the middle of the top 15 list. They, and others doing similar jobs in human environments, have been safe from automation because the robots of the past were fast and accurate only in controlled environments. They don't play well with humans, and they don't deal well with the unexpected situations that inevitably occur in environments that humans spend time in. Those limitations are going away.

If you don't believe me, go to YouTube and search for robots cooking or robots folding towels. A lot of what you will find is research work taking place with the PR2 robots from Willow Garage. PR2 is probably the most broadly used robot in this space, but there are quite a few others. What they all have in common is advanced vision and AI systems that allow them to handle the unexpected. They are also built to work well with humans, unlike the robots used in places like automotive plants which will do serious damage to any humans who accidentally get in the way of what the robots are doing.

I have to push this out to the 5-10 year range because right now it is a bit too expensive. No one is going to put a PR2 in their house at $400,000. However, the Baxter system, at $20,000, is an example of how companies are working to bring that down. (Note that the Baxter spokespeople have done "proper" training and they assure you in the article that no jobs will be lost to their robots. Their reasons have some merit, but that doesn't change the fact that in the longer run their robots will prevent humans from being hired to do things.) There are several efforts that are looking to get personal robotics off the ground and they have some serious money behind them.

Construction and manufacturing don't make the top 15 list, but they are definitely significant in the US economy. I actually expect that a lot of manufacturing will be re-shored and that construction should tick up as well. However, the manufacturing will be done without humans, either by employing robots like Baxter or more advanced, articulated machines, or through additive manufacturing/3-D printing. That is another field on the verge of taking off, with the output quality of devices in the $2000 range improving dramatically this year.

3-D printing is going to touch construction too. Want a new house? Instead of hiring people to build it, there are people working to make it possible to print your house.

10-15 years

Beyond 10 years things get harder to predict, but as all of these technologies mature, the ability of companies to do things without hiring humans is going to grow dramatically. There is one big technology slated to become widely available in about 8 years that I fully expect to hit two more parts of the top 15 job list in just over 10 years: autonomous cars.

Hopefully everyone reading this knows about Google's autonomous cars, which have now driven over 300,000 miles. Both Nevada and California have passed laws to make it legal for these cars to operate when they become commercially available. Since all the major car makers are working on competing technology, it will become commercially available. GM has said autonomous driving will be standard equipment by 2020. I don't think it will be many years beyond that before the jobs that involve driving simply disappear.

What To Do?
So if you buy my arguments from above, the real unemployment level in the US is going up and there is nothing that politicians can really do to prevent it. So what should we do about this? First off, don't ignore it. That is what politicians are currently doing, and it makes sense. After all, who is going to vote for the guy who says the jobs are gone, they aren't coming back, and, by the way, more are going to disappear? No one wants to hear that, even if it is the truth. That's why it falls to people like me, who aren't running for political office, to say it.

The reality is that we shouldn't bury our heads in the sand though. We need to address this challenge and try to come up with ways to organize society so that everyone can live a decent life, even if a large fraction of the population has no jobs because there is nothing productive for them to do in the production of goods. This is the search for a post-scarcity social organization. I don't claim to have a great answer to that, and even if I did, I wouldn't make this post longer by placing it here. However, I truly believe that the right way forward involves dramatic social changes and that the most important thing one can do to make that happen is to get people thinking about this.

So think about this some yourself, post comments to discuss with me and others, then share this with anyone you feel is willing to think about it. Oh yeah, then vote for some 3rd party candidate to protest the two big ones.  :)

Saturday, September 29, 2012

Appropriate use of "free time"

I am a big fan of the expression "Race against the machines". I really do think that is where the job market is these days. However, we are still in the early parts of the exponential explosion. For that reason, many people don't see this as a race against the machines. Instead, it is more of a race against other humans. That isn't simply perception; it is also largely reality. People can race against the machines for a while, but that isn't a race you can really win. The machines improve exponentially. By the time you realize you need to start running, there isn't much time left before you will be completely left in the dust. Instead, humans are racing one another for the positions that the machines can't yet fill. They are also racing toward positions that are still machine-free when their existing knowledge and capabilities fall behind in their current area.

One of the other things that I have been watching a fair bit recently is the free availability of educational resources online. Just a few years ago, there really weren't all that many options when it came to learning most topics. You could buy a book or spend a lot of money finding someone to teach you using traditional educational approaches. Some resources were available online, but they were expensive. Just between Khan Academy and the MOOC (Massive Open Online Course) sites (Coursera, Udacity, and edX), there is now a wealth of educational material available at a variety of levels.

So what does this have to do with free time? Well, if you spend all of your free time doing activities that don't lead to self-improvement, you are losing the race against both other humans and the machines. I feel like US society has a mentality that chases entertainment and leisure. There is no doubt that people need both entertainment and leisure in order to maintain good mental balance, but I worry that most of the entertainment and leisure people strive for does nothing to improve them, and that we have made entertainment the ultimate goal of our life quests. If you want any chance of not losing the race, you need to strive to improve yourself. More broadly, I think that we need to work on being a society that seeks self-improvement, both physical and mental, instead of one that seeks entertainment. I mentioned some mental self-improvement options above. On the physical side, Fitocracy is one of many tools you can use to help motivate you to exercise; in the end, the goal is to turn self-improvement into something more akin to entertainment.

So who should be doing this? I would argue everyone. In my first blog post looking at the course I am taking at Coursera, I argued that MOOCs are great for continuing education for educators and many others. Students who want to go into more depth in a field, or who find that they lack certain background in something, should definitely consider turning to either MOOCs or resources like Khan Academy.

These example groups skirt around the reality of the modern US and the fact that there is a large population of people who are currently losing the race and who really need to focus on self-improvement: the unemployed and underemployed. As it stands, the labor force in the US is below 50% of the population and dropping. That puts it back in the range it was in during the 1980s, when many women still hadn't moved into the workforce.

In most segments of the economy, this means that there is very stiff competition for the positions that are available. That in turn means employers can be selective and only take the top people. If you aren't one of the top people right now, there is only one way to get there: self-improvement. In the past this might have been some type of cruel catch-22, because self-improvement always came at a monetary cost, whether it was tuition for school or just the price of the books you might buy. Today, those barriers are gone. Anyone can sign up for a MOOC for free and listen to lectures from some of the world's greatest minds. If the lectures start going over your head at some point during the semester, that's fine; you aren't contractually bound to go through the whole course, and you lose nothing if you stop at the point where it becomes too difficult. Now the only cost is the time you put in.

Unfortunately, what I think is happening more often in our society is that instead of going for the challenge of self-improvement, people go for the mindless joy of Facebook games (and other similar substitutes). I can understand the temptation. I got sucked into a mindless FB game once and I have to admit that I am occasionally tempted to do such things when I feel overwhelmed. (The only reason I don't is because playing them wouldn't make me any less overwhelmed. In fact, while I was playing odds are good that more stuff would have gotten piled on top of me.) The thing is that FB games, pretty much uniformly, do not improve you or get you ahead in the race. As a result, the time that a person spends playing them causes that person to fall further behind those he/she is racing against.

Now let's be honest: I do believe that automation is causing technological unemployment, and in the long term, the nature of jobs and their role in our society needs to fundamentally change. The cynical side of me says that in the future we might actually want a large fraction of the population playing mindless FB games so that they stay happy and don't riot. However, we aren't to that point yet, and we still live in a world where people need to work if they want to live a decent life. As such, everyone should be doing as much as they can to improve themselves to move ahead in the race. Everyone should take advantage of the new opportunities for education and other forms of self-improvement. They should do so first with the goal of making their own lives better. People I know who are in a position to hire others love seeing self-motivation and self-improvement. They will ask you in an interview what you have been doing with your down time, and if the answer isn't something they see as useful, it is a major black mark on your application. However, I think there is a second reason.

When I look forward toward a world where most people don't need to work for things to run smoothly, I see a world where we need people to be motivated by different things. If everyone seeks mindless entertainment and leisure, I don't think society will work well. It either falls apart completely, or we get a Wall-E type of scenario. However, if we can change our societal norms to become a society that strives for self-improvement, I think that a world where we don't need jobs works very well. People in such a society will continue to work to educate themselves. They will put effort into creative endeavors. Naturally, they will still enjoy the occasional period of rest or activities that are purely for entertainment and leisure, but we need those activities to be what people do occasionally to recharge and to gain motivation for the next step in their life path, not the ultimate goal that people work towards.

Tuesday, September 11, 2012

Proposal for a Technology Capacity in the Trinity Curriculum

This is a follow-on to my previous blog post about the curricular proposal that recently came out at Trinity. On Friday, 9-7-2012, there was a meeting of the faculty to discuss the proposal. It came out fairly quickly that the committee who designed the proposal acknowledged that not addressing technology was a hole in the current proposal and that it needed to be addressed. They also mentioned an addendum for a "Capacity" in technology that had been given to them shortly after the first time they presented the proposal to another group of faculty (the University Curriculum Council), but they hadn't had sufficient time to integrate it, or even to decide if they liked it. In this blog post I want to lay out my idea for how technology should be incorporated, which I believe mirrors the existing addendum closely. I will provide my reasoning, more details than are in that addendum, and also consider the feasibility based on faculty resources.

I want to start off with a quick recap of why I think that including proper technological proficiency in Trinity graduates is essential today and will grow in importance moving forward.
  • The world is run by computers and technology. Digital processors running software control your financial transactions, your utilities, your car, and pretty much everything else you use on a daily basis.
  • This is an area where primary and secondary schooling utterly fails. For example, students in Texas are required to take 4 years of English, Math, Science, and Social Studies. Foreign language carries a 2-year requirement, with the higher-level graduation plan requiring three years. One can debate how effective these efforts are, but there is some foundation in all of these areas. On the other hand, Texas students are required to take NOTHING related to understanding and utilizing technology. (In fact, the way it is included in the curriculum discourages the types of students who would attend Trinity from taking it.) In a way, this makes technology education at the college level something of a remedial subject. However, it is clearly important to the future world, and we have to make sure our students aren't completely ignorant of it at graduation.
  • Computers are tools that can solve certain problems that humans are very poor at. This isn't really about technology. This is about problem solving, and having enough knowledge to be able to identify, and hopefully use, the proper tool for solving various problems. With the growth in data sets, especially publicly available data sets, finding correct answers to more and more problems is becoming something that computers are really good at and humans aren't.
So what should our technology requirement be asking students to do? You can probably tell from what I wrote above that my real goal is to get students to the level of proficiency where they can use a computer to solve problems, in fields that are relevant to them, which are not easily solved by hand. Most of these problems involve either numerics or data processing of a scale that simply puts them outside the reach of unaided humans, but they are things that a computer can finish almost instantaneously if given the right instructions.

I think that by aiming for this objective, these courses will implicitly give students another thing that I feel is absolutely essential, a sufficient comfort level in working with technology. Comfort in working with technology is essential for so many aspects of modern life, and as computing power becomes more ubiquitous, that is only going to grow. However, I don't think this is what courses should focus on. This should instead be something that falls out of the mix when we force students to use technology to solve other problems.

For me, the ideal way to do this involves programming. It doesn't have to be serious programming, but it needs to involve putting together logic using the basic logic structures that have been part of the programmer's toolkit for decades. I would argue that learning how to program changes the way a person views every problem they encounter in a way that is far more fundamental than learning a foreign language. When you program, you have to really break a problem down and figure out how to express it in terms that a computer can understand. So programming is, in many ways, a translation problem. You translate from normal human terms to the more formal terms and syntax of a programming language.

While I think that the programming part is critical, the way in which it is done is far less important to me and should be chosen to fit the problem area. At the faculty meeting to discuss this, someone made a negative comment about courses teaching Excel. If all a course taught was basic Excel, I would agree. However, there is a lot to Excel that goes beyond the basics. Since the goal is to focus on using technology to solve problems, and the problems should be of sufficient complexity that basic Excel won't do it, I would be perfectly happy with a course that uses Excel and has students write VB Script to implement more complex logic. Indeed, if the data sets associated with that course/topic tend to be tables of data, they probably come in either Excel or CSV format anyway, and then Excel isn't just a suitable choice, it probably becomes the ideal choice. (Other spreadsheets would work too. For example, the spreadsheet in Google Docs also has a scripting environment.)
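To make this concrete, here is a small sketch of the kind of logic I have in mind, written in Python against a CSV file since that is the other common format for tabular data (the sales numbers are made up for illustration; a VB Script or Google Docs script version would look much the same):

```python
import csv
import io

# Hypothetical monthly sales data, the kind that might arrive as a CSV export.
data = io.StringIO("""month,sales
Jan,100
Feb,120
Mar,115
Apr,130
May,140
Jun,135
""")

rows = list(csv.DictReader(data))

# Find the longest streak of consecutive month-over-month increases --
# a question that takes real logic, not just a single cell formula.
best = current = 0
for prev, curr in zip(rows, rows[1:]):
    if float(curr["sales"]) > float(prev["sales"]):
        current += 1
        best = max(best, current)
    else:
        current = 0

print(best)  # length of the longest run of increases
```

A single built-in spreadsheet function won't answer a "longest streak" question, but a dozen lines of scripting will, and that is exactly the jump from basic tool usage to problem solving.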

The reality is that tools, whether they be Excel or something else, change over time. That is part of the nature of technology. That is also why courses should not focus just on tools. If a student takes a course in his/her first year and that course focuses only on tool usage, it is possible that tool won't even be available or supported by graduation. However, whatever tools are eventually used to solve those problems will still draw on the same basic knowledge of programming/scripting. That skill/knowledge translates well across pretty much all tools because the nature of programming has shared elements across all languages. In a sense, there are certain constructs that are used to describe algorithms, just as things like past and present tense exist across all natural languages. By focusing on problem solving and forcing students into more challenging problems that require going beyond basic tool usage, we get to the logic elements that persist across time even as the tools and technology change under them.

So how do we make this happen in the curriculum? For me, the main point is that the majority of students do this in a course in a department other than Computer Science. The key is that the computation should have purpose and context. There should be problems associated with a particular subject or line of study. When students take a course in CSCI at Trinity, we can throw some problems at them and give them some context, but everyone in the room has different interests, and in the Computer Science department we are often interested in the nature of computation itself more than the problems it can be used to solve. (This is much like the difference between pure and applied mathematics. Almost no one outside of mathematics cares about pure math until some application is found for it to help them solve a problem.)

So these would be courses taught in other departments, and to get approval for satisfying this capacity, they would have to demonstrate how solving problems through technology fits into the syllabus. Some such courses certainly exist. CSCI courses would obviously qualify, but I think there are probably quite a few others around campus, in departments like Communications and Urban Studies as well as some upper-level STEM courses, that would also qualify without modification. More will be needed though. I think many of these could easily be created by enhancing existing courses with assignments/projects that wouldn't be possible without the technology component. Existing courses that already use technology for problem solving could work with their current hour allotments. For courses that need this added on, I would not want to see the computing elements cut into their normal content. Instead, I would rather see an extra hour added for that purpose. That extra hour would cover students learning how to use the technology for problem solving, as well as finding whatever information (such as data sets) they will use in the process. So a lot of the courses that satisfy this would go up to being 4-hour courses under the current system. It might also be possible to have the technology be an add-on lab that only some students would take. That might not work well in many cases, but allowing it as an option would be very helpful for those situations where it does work.

The situation where additional problems are added to a class that involves using technology to solve them is where resources really come into play with this proposal. If Trinity can't actually enact and sustain a proposal, then it doesn't matter whether or not it is any good. Clearly, the courses that already satisfy the requirement require no new resources. However, that will likely be a small fraction of the total. Most of the seats for this requirement would need to come from courses that are augmented with computation and the faculty teaching those might well need some assistance to do that.

How many seats are needed? I personally think that students would benefit most from having to take 2-3 courses that fulfill this requirement. The hope is that students would see slightly different approaches. That helps them to abstract the ideas and see how to apply them more broadly. For every course that is required, we need ~650 seats/year. Sections typically hold 20-30 students, so it is reasonable to say that we need about 30 sections of these courses each year for every course that is required. That means anywhere from 15-45 sections/semester to have 1-3 of these in the graduation requirements.
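That arithmetic is easy to check with a quick back-of-envelope script (the 22-student average section size is my assumption, chosen from within the stated 20-30 range):

```python
# Rough seat math from the post: ~650 students per class year each need a seat
# in one section per required course.
seats_needed_per_year = 650
avg_section_size = 22        # assumed average within the 20-30 range

sections_per_year = seats_needed_per_year / avg_section_size   # per required course
sections_per_semester = sections_per_year / 2

print(round(sections_per_year))   # roughly 30 sections/year per required course

# Total sections per semester if 1, 2, or 3 such courses are required.
for required in (1, 2, 3):
    print(required, round(required * sections_per_semester))
```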

Is this doable? I think so and I will go into detail below. First though, I can already see some people objecting that there is no reason there should be 3 computing/technology courses required. However, I would remind anyone making that objection that these aren't computing/technology courses. These are courses in subjects from around the campus which include a significant assignment or project which highlights using technology to solve a problem in that field. There are over 20 departments on campus so even requiring three of these courses for graduation only implies that each department offer ~2 such courses per semester.

(If my department chair sees this blog he should probably stop reading at this point.)

Where things get harder when it comes to resources is the fact that not all faculty will feel comfortable putting this type of content into their classes. Even faculty who can find great data sets online, and who want to have their students process/mine those data sets for interesting information, might not feel comfortable with the responsibility of giving the students the capability to do that type of technology-based problem solving. I don't think they should have to do it alone. I can't volunteer CLT to help with this because they have other tasks. However, the CS department, and the instructors housed in it who currently teach IT skills, could likely provide some support.

Currently the CS department teaches ~3 sections of CSCI 1311 each semester and the IT Skills instructors in the department teach ~7 sections of 1300. That is 30 contact hours per semester currently devoted to the CC. Some of those sections would probably be kept untouched, but in a very real sense that is enough human time to assist with one hour of credit for up to 30 courses each semester. In addition, early efforts to do things like prepare video lectures that cover this material could make it possible to get students up to speed with the skills that they need to solve the problems in question with less direct involvement from faculty in that aspect of the course.

In summary, the reality of the modern world is that computers run everything, and students need to have some knowledge of how the software that runs those computers works. They also need to know how to make computers solve problems for them in situations where the computer can do it better than a human. This should be done in the context of topics the students are studying for other reasons, not just because we are twisting their arms to code. We have the resources to make this happen. It just takes a little will on the part of the faculty. The result will be much stronger students who are more ready to enter the world of 2022.

Wednesday, September 5, 2012

A Curriculum for 1992

(Update: The faculty met to discuss this on Friday afternoon, 9/7, and the committee said that they did feel this was a hole in the curriculum and they needed more time with the proposal to fix it. I will keep my fingers crossed. My next post will basically elaborate on my proposal and the staffing requirements for it.)

Trinity has been working on revising the curriculum for nearly a year now, and today a formal draft of that curriculum was sent out to the faculty. As you can tell from the subject of this post, I am not impressed. I'm writing this post not only to express my ideas, and inevitably some frustration, but hopefully to motivate alumni to take a little action in regards to this. Keep reading for details.

Computers Are Just a Fad
At least that is the impression I get from the curriculum document. The charge for the revisions was to create a curriculum for the 21st century. However, the only mention of technology in the entire document comes in the FAQ at the end. Here is what they say:

2. If we are trying to educate students for the 21st century, why isn’t technological and information literacy part of the capacities?
Answer for technological literacy: 
Our committee agrees that the ability to use, understand, and criticize technology is of tremendous importance. Technological advances flow into the classroom as they become relevant to educational content and delivery, and we are confident that Trinity faculty bring these technologies (as well as a thoughtful discussion about their strengths and limitations) into their courses.
Answer for information literacy:
Information literacy is a hallmark of a Trinity education through the university’s commitment to the QEP. It was felt that most, if not all, of our classes support and reinforce information literacy.
At least they see this as a weakness of their proposal, and they acknowledge the importance of technology. However, they seem to think that faculty will somehow magically start incorporating this into the classroom and that students are certain to take courses that use the needed technology. The reality is that college faculty are, to a large extent, some of the least technologically savvy people on the planet. What is more, I frequently see students who work to avoid technology in the same way that they avoid math and science. This is a bad decision on their part, and most will realize it later in life. Part of why students pay to go to college is so that other people can give them direction and help them avoid making those bad decisions. In my opinion, as this curriculum currently stands, it fails miserably in this area.

Globalization: The Game Changer of the Last Several Decades
So what does this curriculum do instead? There are changes in it. One of the big ones is a push to address globalization. This includes a "capacity" with courses on "Global Awareness", "Understanding Diversity", and "Foreign Language". This is on top of the standard elements where you have to be able to read, write, and speak as well as a smattering of courses from humanities, social sciences, natural sciences, and math. The new part is dealing with being an "engaged citizen", which seems to be largely motivated by a desire to have Trinity students prepared for globalization.

In my opinion, globalization is yesterday's news. I made the subject of this post refer to 1992 because, honestly, a really forward-looking curriculum would have included globalization back then. Now this is just a knee-jerk reaction to a boat that was missed two decades ago. Globalization was perhaps the biggest influence on our economy and the general evolution of the world over the few decades up to 2010. However, it isn't what is going to shape the future. Yes, global exchange of information is going to continue to be important, but production of goods is on the verge of heading back the other way. New approaches to digital manufacturing, including 3-D printing and increased automation, are making it possible to put the production of goods back close to the point of consumption. After all, why should we ship materials to China, have them assembled there, and ship them back if they can be assembled here? For the past few decades the answer was that assembling them here cost too much. However, today even Chinese companies like Foxconn are planning to replace their human workers with robots. Those robots aren't any cheaper to run in China than they are here, and energy costs for transportation are only going up. So at a certain point (which I expect is <10 years from now) it makes sense to put the robots close to the end consumer and make things as close as possible to where they will go in the end.

In addition, technology is significantly minimizing the need to actually speak a foreign language to have a dialog with someone who doesn't speak yours. Today Google Translate can allow me to have a reasonably fluid conversation with someone who speaks a different language, and the quality of translation and speech understanding is improving by leaps and bounds. If you have seen comparisons between Google Now and Siri, you can see what one extra year of development means in this space. I fully expect that by 2022 I will be able to speak to someone in almost any language in a manner that is very close to natural without knowing that language. This isn't to say that there aren't cognitive benefits to learning a foreign language. It is just to say that interpersonal communication is going to cease to be one of those benefits.

If Not Globalization, Then What?
So what do I think is the game changer of the coming decades? What should our new curriculum aim for? The paragraphs above should make this fairly clear. Globalization is going to take a back seat to the oncoming surge of digital technologies enabled by machine-learning-based AIs and automation. It is impossible to predict exactly what will be relevant, but based on what is already out there, you can feel pretty confident that in 2022 most students will have cars that drive themselves and many of them will have robots at home that cook and clean. (Sound like sci-fi? Then you need to follow me on Google+, or at least go search for videos of those things on YouTube, because they are feasible today and will be cheap enough to be widespread in a decade.)

There are other things that are becoming increasingly significant as well. The buzzword of "big data" is everywhere for a reason. In addition, the rollout of IPv6 wasn't much hyped, but there are rumblings of the beginning of the internet of things if you look in the right places to hear them. When your shirt has an IP address and is constantly sending information about your temperature and heart rate into the cloud for analysis, then you will begin to understand what these things are. They are primed to change the way we live in dramatic ways.

What does this mean for the curriculum? My take is that if a graduate of 2022 looks at a computer and sees a magic black box with pretty pictures on it, that graduate has already lost at life. They are a powerless consumer with no ability to produce in the markets that will define their time. If we let them become that, we have failed the trust they put in us when they enrolled in our school.

My Proposal
So what do we do about this? What do I think the curriculum document should have included? First, let me tell you what it should not have included. It should not require that every student take a CS course specifically aimed at teaching students to program. That would be a nightmare for me on many different levels. In addition, it wouldn't really benefit the students. Some students need to know how to really code. Those students can learn programming language fundamentals without associated context. The vast majority of students, though, need to learn how to use computers to solve problems with at least slightly more competence than just using pre-written software.

Increasingly, data is what drives the world. Humans are horrible at manipulating even reasonable amounts of data. Computers are great at it. The graduate of 2022 should have seen the data associated with courses in a number of different departments and they should have had to do something beyond just plugging it into existing software to dig for meaning or answer questions based on that data. They need to have some experience using a computer and associated technologies to solve problems. That is what really matters. They need the skills to turn the computer into a tool that they can use to solve problems that are beyond what they can do alone.

I believe the best way to do this is to require that students take a few courses that require them to do computer-based problem solving. The ideal situation would be for courses that normally count for 3 hours, in departments all across the University, to add an extra hour of credit and a computational problem solving component. For example, a course in Political Science could ask students to analyze census data or data from the last presidential election. Have the students answer questions that aren't simple queries in Excel. That way they might learn how to write VB Script and do a little logic to solve the problems. Or maybe the questions you want to answer are well suited to some reasonable SQL queries. Sometimes the right approach might be writing scripts in Python, Perl, or Scala. The details don't matter to me. The details should be chosen to fit the data set and the questions being asked about it. What matters is that students learn how to make technology do what they want instead of acting as passive consumers of software that someone else has written.
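As a sketch of what such an assignment might look like, here is a tiny Python version (the counties and vote counts are made up; the point is that "who won the most counties" takes a grouping step and a comparison, not just a lookup):

```python
import csv
import io
from collections import defaultdict

# Hypothetical county-level results, stand-ins for real election data.
data = io.StringIO("""county,candidate,votes
Bexar,Smith,120000
Bexar,Jones,98000
Travis,Smith,150000
Travis,Jones,170000
Harris,Smith,400000
Harris,Jones,390000
""")

# Group votes by county, then count the counties each candidate won --
# the kind of question that requires writing a little logic yourself.
by_county = defaultdict(dict)
for row in csv.DictReader(data):
    by_county[row["county"]][row["candidate"]] = int(row["votes"])

counties_won = defaultdict(int)
for county, results in by_county.items():
    winner = max(results, key=results.get)
    counties_won[winner] += 1

print(dict(counties_won))
```

A student who writes something like this has gone from consuming software to instructing a computer, which is exactly the skill the requirement is after.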

I've always liked the expression that if your only tool is a hammer, every problem looks like a nail. All too often, I see people do things the hard way because they don't know there is an easy way. Even worse, I see people who can't even conceive of certain questions because the tools they know don't allow them to answer those questions. If our graduates fall into either of those categories, we have failed them. I don't want to see that happen. A major curricular review is the time to make sure we do things right. Unfortunately, I don't think the current proposal is doing that.

Call to Action
So I want to close with a little call to action for any Trinity alumni out there who are reading this. If the ability to control technology and make it do what you want to solve problems has benefited you in life, let your old faculty know. Take a minute to tell them how skills like programming have helped you and what things you wouldn't be able to do without them. You might even just pass on a link to this post if you don't want to write much yourself. In addition, forward this to other alumni so that they too might help the faculty at Trinity see that computers are not a fad that is going away, and that being able to bend and manipulate technology to your benefit to solve problems really is a valuable skill that everyone needs to have.

Tuesday, June 5, 2012

Color of clothing to wear during the summer

This post is a bit out of the norm for my blog, but a while back io9 posted an article on why you should wear black during the summer to keep cool. They linked to another article from The Straight Dope. These have been bugging me for a while. You can go out and do the experiment yourself and see whether you feel hotter in white or black under the summer sun. I strongly expect the white will win. So why does the "physics" explanation say otherwise? My take is that the physics they present has been oversimplified to the point of being wrong. They also lean on a scientific study that has no bearing on clothing, unless you happen to wear clothing made of bird feathers.

Their argument misses one extremely critical point: your body does not emit thermal radiation in the same part of the spectrum as the Sun's primary emission, which is also the part of the spectrum you see in (and yes, black-body radiation is a perfectly valid term for this, despite what The Straight Dope says). When you describe a shirt as white or black, you are talking about its reflectivity in the visible part of the spectrum. That is only a narrow slice of the spectrum, but it happens to be where the peak emission of our Sun lies. For that reason, color matters critically for how much solar energy is deposited in your clothing. A black shirt will absorb a lot more sunlight than a white one, and that absorbed sunlight will be thermalized, increasing the temperature of the shirt.

Where their argument goes astray is when they start talking about the radiation from your body that works to cool it. All objects radiate energy based upon their temperature. The term black-body radiation is used because a perfectly black object, which absorbed all incoming light and reflected nothing, would emit exactly this spectrum. It just happens that stars are pretty darn good black-bodies, and their spectra fit the expected shape nearly perfectly, except where elements high in their atmospheres produce absorption bands.

You, as a human, have a temperature much lower than a star. This means you radiate a lot less energy, and at much longer wavelengths. The peak wavelength for your emission is roughly 10^-5 m. For the Sun it is closer to 5×10^-7 m. You see your shirt and call it white or black based on how it interacts with light at those shorter wavelengths. When it comes to cooling, however, the radiative component depends on the longer wavelengths. To really know what that means, you need a far-IR spectrometer and a look at the absorption spectrum of your shirt. I haven't actually done this, and if anyone has, please comment and correct me if I make a mistake here, but I have a feeling that light t-shirts are going to be mostly transparent in the far IR, and even if they aren't, their "color" there will have virtually no correlation with their color in the visible. The end result is that I expect the thermal radiation cooling your body doesn't depend at all on the color you see your shirt to be. It depends a lot more on the thickness, material, and style of the fabric.
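Those two peak wavelengths come straight out of Wien's displacement law, which says the peak emission wavelength of a black body is inversely proportional to its temperature. A quick computation reproduces them (using ~5778 K for the Sun's surface and ~310 K for human body temperature):

```python
# Wien's displacement law: lambda_max = b / T, with the Wien constant
# b ≈ 2.898e-3 m·K. This reproduces the peak wavelengths quoted above.
WIEN_B = 2.898e-3  # Wien displacement constant, in m*K

def peak_wavelength(temp_kelvin):
    """Wavelength (meters) of peak black-body emission at temperature T."""
    return WIEN_B / temp_kelvin

sun = peak_wavelength(5778)   # ~5.0e-7 m: right in the visible band
body = peak_wavelength(310)   # ~9.3e-6 m: far infrared, roughly 10^-5 m
print(f"Sun:  {sun:.2e} m")
print(f"Body: {body:.2e} m")
```

The factor of ~20 between the two wavelengths is the whole point: a shirt's visible color tells you about absorption at 5×10^-7 m, not about its behavior at 10^-5 m.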

As the article says, if there is any wind, then the dominant heat transfer mechanism will be the air moving heat away. That process too is independent of color. Perhaps the wind will take more heat from a black shirt, but only because the black shirt was hotter to start with so it had more thermal energy to give up. The white will still be cooler in the end.

My conclusion: wear white during the summer, and if you really are worried about whether your shirts are making you too hot, send them to a lab for some far-IR spectroscopy to see which materials really let your body heat back out.

Monday, May 21, 2012

Certifying Knowledge: A Business Idea

I have written previously about things that I feel need to happen in education reform and how there could be a dramatic change in higher education that some refer to as the "education bubble". There is one big thing that I see being needed to really trigger that disruption. It is clearly doable with current technology, but it hasn't been put into place yet. I feel like someone who built this and put it up quickly could make a lot of money. The piece that is missing is certification of knowledge with validation of the identity of the person who holds that knowledge.

There are lots of sites out there now that measure various capabilities and skills. TopCoder is one of the first that jumps to my mind. They have a number of different competition-based metrics to indicate how fluent a programmer is at different tasks. With all the MOOCs coming online and their automatic evaluation, the ranks of such sites are growing quickly. However, one thing to note about the biggest MOOCs, like Coursera, is that they have a somewhat limited form of recognition of completion. Some of this is because the schools posting courses there still need a business model. Probably more of it, though, is because certifying a person would require that you really know who they are.

Universities go through a fair bit of work to verify a person's identity. At small schools that is implicit in the overhead of having small classrooms. I know all my students by name on sight. If someone else were to show up on a test day, I would know something was wrong. Larger schools have other mechanisms to make certain that the person demonstrating knowledge is the same person who is getting the grade and the credit. This doesn't happen online yet, but it can, it needs to, and inevitably it will. It is just a matter of who will build the system. The rest of this post outlines my idea for how such a system would be built and how it could grow into a useful product.

Monitoring the Tester
The key, in my opinion, is the ubiquitous spread of webcams. Biometric sensors can identify that a person is present, but a webcam and microphone make it possible to further show that the person is the one doing the work being evaluated. The webcam can do the same type of job as other biometrics to make certain that the person you want is the one at the computer: facial recognition, voice recognition, potentially even information on the person's iris if they get close enough or the camera has enough resolution. In fact, the more forms of identity verification you have, the better. The site should mark each test result with the verification techniques that were used.

After the initial verification, the webcam and microphone can be used to monitor the person taking the test. Eye tracking can see where he/she is looking on the screen. Smaller body movements can show that the person seen in the image is actually the one clicking on certain answers at certain times or doing the typing. The microphone can monitor what is being said in the room to make certain that the person is not being told the answers by someone else in the room. One might even consider having 360 degree webcams to monitor the whole room.
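The overall structure of such a monitor is simple: sample the webcam and microphone at intervals, run each identity check, and record which checks passed so every result can be stamped with the verification methods used. Here is a rough sketch of that loop. The recognizers here are stand-in stubs, not any real recognition library; a real system would plug in actual webcam capture and face/voice recognition behind the same interface.

```python
# Sketch of the periodic-verification loop described above. The frame
# objects and the verify_* callables are hypothetical stand-ins for real
# capture and recognition components.
from dataclasses import dataclass, field

@dataclass
class SessionLog:
    checks_passed: list = field(default_factory=list)  # (frame index, methods)
    flags: list = field(default_factory=list)          # suspicious frames

def monitor_session(frames, verify_face, verify_voice, log):
    """Run identity checks on each sampled frame; flag any total failures."""
    for i, frame in enumerate(frames):
        methods = []
        if verify_face(frame):
            methods.append("face")
        if verify_voice(frame):
            methods.append("voice")
        if methods:
            log.checks_passed.append((i, methods))
        else:
            log.flags.append(i)  # no check passed: absence or possible impostor
    return log

# Stub recognizers for illustration: treat a frame as a dict of flags.
fake_frames = [{"face": True, "voice": True},
               {"face": True, "voice": False},
               {"face": False, "voice": False}]
log = monitor_session(fake_frames,
                      verify_face=lambda f: f["face"],
                      verify_voice=lambda f: f["voice"],
                      log=SessionLog())
print(log.flags)  # [2] — the third sample had no successful identity check
```

The design point is that each recorded result carries its own evidence trail, so a downstream certifier can decide how much to trust it based on which checks actually ran.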

The goal here is to have a system that can be run on a large fraction of the devices someone might want to use to do work that demonstrates their knowledge of a topic, and for that system to be roughly as accurate at preventing outside assistance as a human would be. Of course, it needs to be automated so that it can scale well. You can't be paying hundreds of people to watch other people taking their tests through a webcam. That loses the efficiency.

Certification Clearing House
To make this more valuable, you have to put it together with many forms of verification of knowledge. Many things will be done with fairly simple tests, but the ideal would be that you could have a site that not only does the verification, but which can tie in with other sites so that you might be able to do something like verify the identity of a programmer competing for TopCoder without having to rewrite TopCoder.

It would also be nice if employers could submit their own types of tests, so this could go beyond basic academic areas. The goal would be to have a significant database of areas of competency/mastery and the ability to link up employers and potential employees. In an ideal world, you would eventually use the type of oral exam method I described earlier, with a Watson-style area expert running the tests. In addition, it could be tied in with education tools to help teachers and learning coaches see where people are having problems and get them past those obstacles. This type of system could be used to demonstrate the capabilities of people from the low grade levels all the way up through graduate-level study.

Business Model and Added Benefits
If you were to create a company on this model, it might be possible to collect money from both sides. There is clear value to individuals to have a certification of what they know and what level of mastery they have demonstrated in that knowledge. In addition, there is benefit to entities that want to verify that someone knows something.

One of the interesting perks that I see in this type of approach is that employers could actually build a description of skills for various positions. Some could be hard limits. Some might be desired attributes. Still others might be things where you need to have demonstrated a combined ability in several areas with the exact distribution being fairly insignificant. HR in this world gets a lot easier on the hiring end, and probably gets downsized a bit because there isn't nearly as much use for resumes.
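The hard-limits-versus-desired-attributes idea above maps naturally onto a small scoring function: hard limits act as filters, and desired attributes contribute a weighted score. The skill names, levels, and weights below are invented for illustration only.

```python
# Minimal sketch of skill-based matching: hard requirements filter,
# desired attributes score. All skills and numbers are hypothetical.
def match_score(candidate, hard_limits, desired_weights):
    """Return None if any hard limit fails, else a weighted score."""
    for skill, minimum in hard_limits.items():
        if candidate.get(skill, 0) < minimum:
            return None  # fails a hard requirement outright
    return sum(weight * candidate.get(skill, 0)
               for skill, weight in desired_weights.items())

hard = {"sql": 3}                        # must have at least level-3 SQL
desired = {"python": 2.0, "stats": 1.0}  # weighted nice-to-haves

alice = {"sql": 4, "python": 5, "stats": 2}
bob   = {"sql": 2, "python": 5, "stats": 5}  # fails the SQL hard limit

print(match_score(alice, hard, desired))  # 12.0
print(match_score(bob, hard, desired))    # None
```

The "combined ability with insignificant distribution" case falls out of the same shape: give several skills equal weights and set a threshold on the sum rather than on any individual skill.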

In addition, if you want a job, you should be able to see the metrics in some form to see whether you qualify. I see this as a great motivator for kids in school. They could actually see what types of skills are valued for specific types and levels of jobs. They could see where there are going to be ceilings that come into play because they lack depth of a certain type of knowledge.

This goes back the other way too. Employers can do an analysis of real performance of employees to their measured skill sets and refine what types of skills really lead to better performance in a particular job. As long as enough of the information is open, this winds up being a generative system. People will find all types of new uses for the data and new meaning that can be pulled out of it.

Wednesday, May 9, 2012

Sociology/Psychology of the Superstar Effect

The following thought hit me on the way in to work: what are the full sociological and psychological implications of having a superstar effect across many industries? This is right in line with the broader theme of technological unemployment. Technology enables a few people to serve the needs of vast segments of the population. The result is that an entire market can be completely saturated by a few individuals, and that doesn't lead to all that many jobs being created. The most visible example of this today is Amazon servicing a huge fraction of the purchases that happen in the US, with the possibility of extending that reach even further.

None of that is new thinking. It is stuff I have written about earlier, and which I post about a lot on Google+. The new thought hit me as I was driving past the local high school: how does all of this potentially impact the thinking of youth, and from there spread out into society? Perhaps right now the answer is that it doesn't. After all, kids are often blissfully unaware due to their youth. However, I don't fully believe that, and even more, I really don't expect it to be true in a few years. Teenagers don't see social change, they are social change. They don't realize that ideas they are internalizing might be things that were revolutionary to those before them. They simply believe them, because those ideas seem logical at the time their view of the world is being formed.

So what psychological impact does it have if you internalize the idea that whatever you decide to do with your life, there will wind up being a few superstars and everyone else is pretty much doomed to failure? Here I mean failure in the sense that you probably can't make much of a living at it. I know that thought does not sit well with me. I know well enough that while I am good at some things, even great perhaps, I am not world-class. Of course, I am well removed from my teenage years. Anyone who knew me then knows I had a pretty high opinion of myself, and I truly expected to be world-class at some things, like math and physics.

There is also another area where the superstar effect has been in place for decades, yet it doesn't seem to faze too many kids: sports. High-dollar professional athletes make a lot of money. However, as an example, the NBA only has around 360 active players at one time, and that is where all the big money is. D-League players can make a living, but even then we are talking about ~1000 people pulled from around the world, with an emphasis on the US. What fraction of kids who play high school sports will go pro? I don't know the exact figure, but needless to say, it is very small. This doesn't prevent kids from playing sports. However, I have a feeling that most are playing for the fun of it, not because they honestly expect to become professional athletes. Only the top 1-2 players in a school can convince themselves they will be pros. Who knows, maybe the possibility of being a superstar pushes the others to keep working harder.

That's fine for sports, which kids normally do for fun. What about in everything else though? How will a kid approach a math class if he/she feels the odds of "making it" in math were the same as making a professional sports team? You can ask the same question for science or English. Does having that thought in your head drive you to work harder to make sure you are the superstar, or does it eventually defeat you when you realize it simply isn't going to happen?

I am afraid the answer is the latter option. I am also afraid that this idea will eventually sink into the heads of young people as automation of both routine, and not so routine tasks increases. If our social structure is the same then as it is today, I think that will be a significant problem. As soon as they realize that they aren't going to be superstars, they give up because only the superstars survive and everyone else gets stomped down.

Tuesday, May 8, 2012

Klout and Occupation

Basics of Klout
I've been meaning to write a post on Klout for a while. Jon Perry wrote a very nice blog post at The Decline of Scarcity on the topic after Wired wrote an article about Klout. The basic idea of Klout is that they provide a metric for one's influence in social media. Some people love the idea while others hate it. The Wired article shows that however you might feel about it, some places are definitely starting to use it. I also feel that Jon Perry did a great job of describing why Klout, or something like it, could very well become a significant metric in the future. In a world where material goods are abundant, the attention of people will be one of the few scarce resources. That scarcity will give it value. If an individual can command a fair bit of attention from others, that implicitly gives him/her a certain value.

Of course, we don't live in a world yet where physical goods are abundant. That doesn't mean that commanding attention on social media is without value. Indeed, if you put yourself in the shoes of a marketer, there is tremendous value in knowing who different people listen to on different topics. Instead of pitching to everyone, you can focus on the people who can complete the pitch for you and get others to listen. The companies that are giving Klout perks today are the ones who realize this and who want to see what value they can extract from it.

Your Job Today and your Klout
An idea that I haven't seen discussed elsewhere, but which weighs heavily on my mind, is how occupation impacts Klout. This idea hit me because of a comment I saw on an earlier discussion of Klout. The commenter said that he was too busy doing things to spend the time posting on social networks that might gain him Klout. This strikes me as a very valid point. There are certain jobs which will not lend themselves to a high Klout score. Indeed, that is probably true for most jobs. People in those jobs can still achieve a high Klout score, but they must do it on their own time.

I actually spend a fair bit of time on social media. I also spend a lot of time working to keep up on current technology. In my mind the two are both related to my job and to one another. The vast majority of people I interact with through social media are current or former students. As my job is to help prepare students to enter the world in ~4 years, I feel I need to have some idea of what is coming down the pipeline in that time frame. Much of my activity on social media is in the form of sharing thoughts and interesting articles with those current and former students.

In other words, there are aspects of my job description which naturally lead to me having an increased Klout score. I can think of other occupations where that would be even more true. People in PR and marketing could virtually live in social media these days, experimenting with new approaches to reaching out to people. On the other hand, I can see most of my students going into jobs where they will have a low Klout score unless they go out of their way to boost it. Their jobs will focus on producing code, not distributing knowledge on how code is produced.

What do you think? Is this link between occupation and likely Klout score a fundamental problem with Klout or does it mean that Klout really does measure what it is supposed to?

Friday, May 4, 2012

Support the "Dr. Mark C. Lewis Closet of Tears"

First, some background. Trinity is building a new science facility. It is called the "Center for Science and Innovation", CSI. You can watch the construction online. This is a huge upgrade to the campus, and the new building will house Chemistry, Biology, Engineering, Psychology, Neuroscience, Entrepreneurship, and most importantly, Computer Science.

On Thursday, 5/3, I went to a meeting up in the President's office with a bunch of people who were much better dressed than I was. At this meeting faculty representatives from the different departments described how the new building is going to make a huge positive impact on how they work. The really well dressed people need that information so they can try to raise funds for the building.

From Computer Science, the biggest benefit of this move is just that we get pulled into the center of campus, and will be far more visible than in the past. We will also have beautiful rooms and student spaces. Every one of our classrooms has a wall that is either completely or nearly completely glass. One views the main walkway through the building and the other two have a perfect view out over the center of campus. (Note the big glass section at the far left in the figure above.) There are student areas right outside the row of faculty offices as well as further away for those who want to avoid the faculty. One of those overlooks a large studio area that will be primarily used as an Engineering design space, but which can also be cleared out for me to roller skate through or perhaps for other purposes.

After this meeting I had some final review sessions and I talked to students briefly about the meeting. I mentioned that three long-time faculty of the CS department are retiring this year and how it would be wonderful to have those classroom spaces dedicated to them. My wonderful students, being as caring and considerate as they are, had another suggestion. They said that in honor of my efforts over the brief 11 years I have been at Trinity, I should have a closet dedicated to me.

I was so flattered I had to run with the idea. My inspiration came from an anonymous student who wrote the following on a course evaluation last fall: "Oh dear god. Countless hours of my life spent curled up weeping on the floor ..." To honor this student and so many others like him/her, I felt it was only appropriate to write this blog post asking you to donate money to support the "Dr. Mark C. Lewis Closet of Tears". I would like to see this set aside as a private space where students can go when they have reached their wits' end and have given up all hope of stuffing more information into their brains. Or maybe for those students who are tired of banging their heads against walls searching for solutions to problems they think I picked just because they are unsolvable.

Your donation can not only support the construction of this valuable space. It can also help to pay for appropriate padded materials for the walls, floor, and ceiling. We want to make sure that future students can get out their frustrations and anxiety in a safe, supportive place.

So while you are writing out those big checks for the "Dr. Maurice Eggen Teaching Lab", the "Dr. Gerald Pitts Teaching Lab", or the "Dr. John Howland Teaching Lab" to honor their many decades of service to Trinity and their personal impact on your own education, send a note to Rick Roberts in development letting him know you want some fraction of that to go to the "Dr. Mark C. Lewis Closet of Tears". It is only appropriate that you should remember the children. Don't make them weep on the hard concrete hallways. Give them a safe place to bang their heads.

Disclaimer: I sincerely hope that anyone who read this far realized this is largely satirical and is actually a request to support the construction of CSI, and hopefully to acknowledge the contributions of Drs. Howland, Pitts, and Eggen. I honestly don't know how tour guides would explain a plaque that includes the text "Closet of Tears". You really should contact Rick Roberts or others in the Development office about plans that are in place and how you might support them.