
by Greg

Do we expect too much from graduates?

17:00 in Blog by Greg

…education is preparation for appointments not yet made

- Howard Swearer

(Brown University President 1977 – 1988)

How much can a young brain really absorb in the three or four (in the case of a placement option) years of an undergraduate degree programme? It is a question I ask myself more and more every year, and one no doubt amplified by the current economic backdrop. The Creative Technologies (CT) programme is by definition a multi- and inter-disciplinary course of study, and so a range of diverse but (we feel) related topics are discussed. Our goal ultimately is to provide the framework whereby students can explore, think and grow, graduating as well-rounded, culturally informed, creative citizens embodying effective communication, discipline in their work, critical and analytical thought, high-level technical skills and business awareness. It is a complex recipe aimed at enabling the long-term success of our graduates, which must surely be a leading requisite, right?

Why this breadth of study? Would it not be easier to specialise in particular fields and thereby keep everything, well, simpler? Yes, it would certainly be easier to rationalise the course content down to a smaller number of specific areas, but this does not reflect the world outside. Firstly, by looking beyond specialised fields we develop different but complementary skills; as a result the mind grows, and new possibilities for creatively combining and applying new knowledge and skills are discovered. The information silos of previous generations are simply not something we can return to. Secondly, this approach protects students from possible professional obsolescence should a change in the economic or business landscape leave their specialist field no longer in demand; evidence of which has been all too abundant in recent years. Lastly, students will sometimes require a little assistance in “finding their groove” – which area(s) excite them most, where their talent(s) lie, what kind of opportunities exist – and some future gazing by way of assessing the long-term prospects of a particular pathway. On numerous occasions we have (very happily) observed students join us with one set of interests and graduate with a greatly expanded set.

Breadth, however, presents a not insignificant challenge: within a fixed period of study, enabling greater breadth necessarily reduces the depth of enquiry in some areas. In short, the broader the range of subjects, the less time you have to go deep on any one of them. The application of knowledge to projects and real-world situations, as we all know, takes time. Experience takes time.

Education at higher level is not simply a set of required courses and exam results. It is the summation of all three or four years’ curricular achievements, personal development (much of it outside the institution), experience studying abroad or on placement, and the wealth of experience that students gather while working together in labs, studios, project groups and in performance-related areas, e.g. music, drama, games and sports. A well-educated individual understands that learning is lifelong.

When we hear from business leaders about their requirements of graduates, the agenda quickly shifts towards skills, or more correctly, the (often specialised and experienced) skills they need today. I suspect few will argue that, in the main, education globally has in recent years underestimated the extent of change brought about by our increasingly technology-enabled world; the same could also be said of industry. Many were caught off guard, and success (for both the academy and business) comes down to an ability to adapt, and to do so quickly – integrating new and old experience through a constant cycle of review, refresh, refine or reject. Business and the academy do not exist in mutually exclusive worlds and it is imperative (for all our sakes) that we find a way to effectively amalgamate the goals of work readiness and long-term career success.

So, what can they absorb? As it turns out, given the right attitude and every possible support from the institution, students can absorb quite a lot. What we haven’t yet found, however, is a means to hack experience. As already mentioned, this requires time. Post-graduate (applied) study and – as proposed by our own Professor Paul Moore – professional apprenticeships would appear to be a logical means of deepening skills through the experience of more projects; something we can all accept as essential.

by Greg

Why we banned Facebook

18:36 in Blog by Greg

Let’s first build some context around the title of this post. Though the verb ‘ban’ is used, we haven’t actually formally decreed that the use of Facebook is prohibited amongst our students – that would be neither appropriate nor possible amongst a group of adults – nor have we deployed network tools to restrict access to Facebook across School-provided facilities. We have, however, made a strong case to all our students (assembled for a recent all-programme meeting) that the use of Facebook, including Facebook Groups, for course-related activity is not supported by the course team. Why would we make such a decision, and how do our students feel about it?

There are two main reasons for this decision:

Facebook is a company

Of all the social platforms Facebook has one unique qualifier: its size. With over 1 billion users worldwide (33 million in the UK) Facebook has unparalleled scale. Looking at it another way, approximately 1 in 2 of the UK population is on Facebook. I’m sure we can therefore agree that its reach into our society is unique. Consequently, Facebook has come to be viewed less as a company – which it most definitely is – and more as a benign public utility – which it is not. A public utility would by definition be bound by a degree of governmental or community oversight. The same is not true of a company, whose primary objective is to create value for itself and its shareholders. This value can also benefit the user base, and I do not wish to imply a specific problem with the model, just that it is quite different from that of a public utility.

On highlighting Facebook’s data brokerage activities, one keen student asked: how is Facebook allowed to do this? The question is most revealing and brings us to a key point: Facebook’s activities are governed by its (ever-changing) Terms and Data Use Policy, which all users agree to on signing up.

The combined terms of service and privacy policy – currently sitting at 14,000 words – state extremely broad operational norms and grant rights over user content. Of even more concern is the manner in which the policies are updated, with seemingly little regard for the existing users of the service. As educators it is our responsibility and charge to fully inform students of the implications of clicking Sign Up.

Freedom to make mistakes

First and foremost amongst our objectives as a course is the provision of a safe environment for all our students. This environment must include the ability to make mistakes; reflecting on mistakes is, after all, an extremely important part of the learning process. In most cases, privacy settings on Facebook default to public, which effectively means that content can be viewed across the entire platform, by anyone. Astute users are typically more diligent in the configuration of privacy settings; however, such users appear to be relatively few in number. Facebook’s privacy policy has resulted in a normalisation of public status, an emergent characteristic which most of us will find counterintuitive. Taking all of this into consideration, combined with questions over the period of time for which Facebook retains deleted user data – “removed content may persist in backup copies for a reasonable period of time”* – we (the CT team) have to conclude that Facebook is in direct contravention of our safe-environment guiding principle and that it is not appropriate to leverage the service in the context of education, either officially or unofficially.

Communication and effective tools to enable sharing and collaboration are of fundamental importance to both staff and students of the programme. It is for this reason that we have operated our own bespoke communications and collaboration platform since 2006 – we call it CTNet. CTNet has several social-like features and it is private to all but CT staff and students. Data is handled according to strict guidelines and when something is deleted, it really is deleted.

It was encouraging to find that the majority of our students were to some extent aware of the wider reality and further discussion revolved around online privacy and the implications of using other popular services like Google Search, Gmail and Twitter. While terms of service will apply to all online services, each service needs to be assessed individually.

* Extract from Facebook’s Statement of Rights and Responsibilities

by Greg

Hello World! Processing

11:49 in Blog by Greg

Hello World! Processing is the first in a series of documentaries on creative coding, ideas, form and play. The film includes interviews with leading artists, designers and developers including Ben Fry, Casey Reas, Aaron Koblin, Marius Watz, Robert Hodgin, Daniel Shiffman, Jer Thorp, Karsten Schmidt and Tom Carden. The series will also look at openFrameworks and Pure Data.

More details here

by Greg

Liberating Software

20:20 in Blog by Greg

…computer science is a liberal art, it’s something everyone should know how to use, at least, and harness in their life. It’s not something that should be relegated to 5 percent of the population over in the corner

Steve Jobs  (1955 – 2011)

We react to software the same way we react to movies and music. The language of our lives

Andy Ihnatko

I should perhaps clarify the title of this post a little further; it refers to the liberation of software or more specifically software development. OK, but liberate from what? Liberation from the countless labels, preconceived notions and misconceptions relating to how really great software is made. If we accept that software applications increasingly perform as vantage points into our world, we inevitably realise the emerging synthesis of software and culture, technology and art.

The evolution of our everyday devices – experienced through a multiplicity of screens, year-on-year increases in processing capacity, reductions in physical size and cost, coupled with improvements in network connectivity (although slow by comparison) – has brought us to an extraordinary juncture. As we discover new and exciting ways to interact with information, design and development must be centred around human experience.

And herein lies part of the problem: I’ve already used two labels – design and development – in reference to how software is built, and these in turn can imply that the two are distinct from each other. Design and development have by and large evolved within the traditionally distinct spheres of art and computer science. The reasons for this are not complicated and have been discussed in previous posts; the key point today is that this separation can no longer be supported. Thinking in terms of separate activities will only lead to failure as ideas merge unsuccessfully.

I recently had the opportunity to attend the inaugural Úll conference in Dublin, where Kyle Neath, Director of Design at GitHub, addressed the topic of design versus development, stating that “labels are frustrating and lead to arbitrary conflict.” Interestingly, while Kyle’s professional title suggests design, he describes himself as a builder, holding skills in both design and development. Whether in education or professional work, how can we expect to fully realise our potential if we insist on classifying talents, abilities, skills and duties in strict verticals?

So, what about the realities of software development? Not that long ago it was necessary to spend a significant amount of time just negotiating the complexities of pushing an application live. Today however, thanks to the work of many extremely clever people, excellent tools and frameworks exist; these tools and frameworks enable the management of complexity which in turn grants us the opportunity to invest more time in ideation and build. I believe it is unfair to charge computer science departments with sole responsibility for informing future builders in the use of these tools and frameworks. The next set of problems facing the human network are very much harder and it is here that computer science should be allowed to concentrate – investigating future tools, frameworks, solutions and possibilities. More importantly, it is now within the reach of a far greater proportion of the population to start dreaming up and realising new software applications.

The acquisition of the popular mobile photo-sharing service Instagram by Facebook for a staggering 1 billion dollars has been a topic of much discussion in recent weeks. Instagram CEO and co-founder Kevin Systrom is not strictly an engineer; he had previously worked in marketing and taught himself how to code in his spare time. The meteoric success of Instagram (first 30 million users in less than two years) is attributable to a variety of factors; the leading contributor however is that the application did a simple thing really, really well. Not that it was a new idea – sharing photos with friends is hardly new – just that the execution was at all times user-focused and of a very high standard. You only had to use the application once to realise that great care and attention had been invested in the user experience. Ideas need the support of passion and it is not the case that innovation is restricted to experts within specific domains.

What if software development in the context of interactive systems simply requires a new name? A name which captures more fully the breadth of relevant subject areas? This is not to suggest a dumbing down of the endeavour, not at all! Only that we evoke a more rounded approach and avoid unnecessarily alienating future builders through the use of predominantly engineering-based naming conventions and examples. Education clearly has a significant role to play and an urgent reconstitution of the art–science balance is necessary, with particular attention on the early school years. It is vitally important that our children are prepared to explore, create and persevere in technology-rich environments; not to function simply as users of software but to maximise their potential as builders of new systems. Engineering and art are nothing more than manifestations of human creativity and therefore share much in common.

In closing I feel I should highlight that the above is not intended as an argument in favour of generalist skill sets; I have indicated in previous posts that excellence requires focus; whatever the subject area we simply must have depth. We accept (rapid) change as a reality and therefore knowing everything is unrealistic, however, in addition to our key strengths, we should each be versatile enough to develop solid understandings of the mechanics of related disciplines and in so doing gain perspective on the bigger picture.

by Greg

Get Making!

00:55 in Blog by Greg

The specialist in comprehensive design is an emerging synthesis of artist, inventor, mechanic, objective economist and evolutionary strategist

Buckminster Fuller (1895 – 1983)

While there is no magic formula for innovation, we know that it stems from an investment in people and a culture of creativity; a culture supporting freedom, collaboration and focus – each an important element in inventing the future. One of the great challenges of our new globalised reality is the delicate balance between depth in understanding – a process which can often take some time – and the necessary rapidity of new developments. Education is clearly key; however, we also need to start thinking a little differently. Whether we realise it or not, we are currently in the midst of an industrial revolution; a revolution where good ideas can come from anywhere, digital bits are both material and currency, and where each of us can be creator, doer and maker.

In January 2012 the Creative Technologies (CT) programme commenced a new module in “Interface Technologies and Applications” and, though the title may be lengthy, the objective is quite simple: get students making stuff (combining both hardware and software) as early as possible. Utilising a variety of software tools (many of which are open source), a selection of electronic components and the Arduino microcontroller, year one CT students are currently on their way to realising their designs through an initial prototype upon which they will test, learn and iterate.
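To give a flavour of the kind of starting point involved, here is a minimal Arduino sketch of the sort a first prototype might grow from – reading an analogue sensor and switching an LED. The pin assignments and threshold are illustrative assumptions only, not taken from any particular student project.

// Minimal sketch: read an analogue sensor and light an LED when a threshold is crossed.
// Pin numbers and threshold are illustrative only.
const int sensorPin = A0;   // e.g. a light-dependent resistor on analogue pin 0
const int ledPin = 13;      // the on-board LED on most Arduino boards
const int threshold = 512;  // midpoint of the 0-1023 analogue input range

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);       // echo readings so students can test, learn and iterate
}

void loop() {
  int reading = analogRead(sensorPin);
  Serial.println(reading);
  digitalWrite(ledPin, reading > threshold ? HIGH : LOW);
  delay(100);               // sample roughly ten times a second
}

Even something this small exercises the whole loop of wiring a component, reading data from the physical world and acting on it – the bridge between the physical and the virtual in miniature.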

The new module sits to the fore of an updated three-year pathway in what we refer to as Interactive Systems: bridging the physical and the virtual through content-rich, user-focused applications of data…from literally any device. We’re confident that placing a focus on this bridge will achieve two things:

1. A vastly improved understanding of and interest in software coding and electronics

2. A boost in innovation as barriers to entry are reduced and rapid prototyping becomes commonplace

by Greg

Managing Attention

18:50 in Blog by Greg

Distracted from distraction by distraction

T.S. Eliot, Burnt Norton (1936)

In one way or another I’ve been pondering the matter of attention for the best part of two years. Our working/studying environments push ever more distractions and interruptions into the path of our attention, with the result that sustained focus is extremely difficult. Finding time and space to think is highly important. As Edward de Bono pointed out, “some of the best results come when people stop to think about things that no one else has stopped to think about.” Competition for attention isn’t a new problem; in Brave New World Revisited (1958) Aldous Huxley reminds us of “man’s almost infinite appetite for distractions.” However, given the rapid proliferation of the network, the contemporary picture perhaps differs in terms of scale – Clay Shirky suggests that abundance creates information overload and that this began with Gutenberg and the printing press (circa 1450). Though we may believe otherwise, our attention is like a spotlight, with only the directly illuminated areas of our world arriving at perception’s doorstep. Tunnel vision is in fact part of our makeup. You can test this for yourself by visiting The Invisible Gorilla website, which is based on research by Daniel Simons and Christopher Chabris.

In a knowledge-based economy, information is the commodity and it is in abundance. We consume three times the amount of information we did 50 years ago. In 2011 we created and replicated a stunning 1.8 zettabytes (ZB) of data. That’s 1.8 billion terabytes (TB) of data, roughly equivalent to 200 billion 120-minute high-definition movies. By 2020 the amount of data being produced will have increased by a factor of 50, driven largely by internet-enabled devices. In the UK the internet is now used by 73% of the population, with 60% of that number using social networking sites (up from 49% in 2009). In the context of our daily lives, in addition to tracking possibly numerous e-mail accounts, data feeds and social media, there’s also the sense that there’s even more data out there which needs to be tapped as soon as possible. We are highly social creatures; communication with family and friends and confirmation of individuality sit deep within our psyche. Let’s not forget that we follow 150,000 generations (3 million years) of humans who have evolved to live in one world, the physical world. Relatively speaking, the web as we know it today is a very recent development (circa 1995) and with it came a second, virtual world. Our lives therefore are getting increasingly noisy, and research shows that this increasing volume of information can adversely affect both performance and well-being, even causing burnout.
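For those who like to see the arithmetic behind such comparisons, a rough back-of-the-envelope check (the figure of roughly 9 GB per film is implied by the comparison rather than stated anywhere):

$$
1.8\ \text{ZB} = 1.8 \times 10^{21}\ \text{bytes} = 1.8 \times 10^{9}\ \text{TB},
\qquad
\frac{1.8 \times 10^{21}\ \text{bytes}}{2 \times 10^{11}\ \text{films}} = 9 \times 10^{9}\ \text{bytes} \approx 9\ \text{GB per film}
$$

which is a plausible size for a two-hour high-definition movie.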

The human sensory system is extraordinary. The eye can detect as few as two photons entering the retina. Our ears are sensitive to changes in air pressure over a staggering range; from just audible to painful represents a difference of over a factor of 1 trillion. And yet, despite – or perhaps because of – these abilities, we can find it difficult to navigate with the car radio on, and find reasoning through a challenging problem almost impossible with the TV on. It appears that we reduce sources of stimuli in order to maximise the amount of attention we can allocate to a specific task. We may believe that our eyes capture everything before them like a video camera and that memories will play back as recorded; the reality, however, is that you see only a small portion of your environment at any given time. Similarly, we filter sounds continuously by way of identifying what we’re interested in. The flow of sensory information into our consciousness is therefore compressed by attention. Which brings us to the subject of multi-tasking*: given our limitations in processing attention-rich inputs simultaneously, it is a complete myth and actually reduces your overall capacity – studies show that switching between tasks can result in a drop in IQ of up to 15 points (see Clifford Nass and Multi-tasking is Bad for Your Brain).

Within any creative field focus is crucial. The ability to direct all your attention onto an important matter is the only way to create things of value. It’s really very simple: deep thought takes time. Fast responses are driven by intuition and heuristics, and though we would not be able to go about our daily lives without them, responses from our more deliberate, reasoning system must be given processing time. In 2009 John Cleese gave a presentation (video below) to the Creativity World Forum in which he spoke about the origin of ideas, the unconscious self and how it’s important to create “boundaries of space and time.” Quiet time yields greater attentiveness and improved cognition. As long as our immediate attention is absorbed in reacting to new inputs we can never realise our fullest creative capacity.

[Embedded video: John Cleese at the Creativity World Forum, 2009]

Pico Iyer recently wrote in the New York Times that “the central paradox of the machines that have made our lives so much brighter, quicker, longer and healthier is that they cannot teach us how to make the best use of them; the information revolution came without an instruction manual.”

To conclude, I’m not suggesting we all disconnect and ignore the profound possibilities of our combined intelligence. That would be silly. Our pre-web environment was relatively scarce in terms of content and hence time for concentration was more plentiful. We neither can nor should return to that state. What I am suggesting is that discipline (read: focus and endurance) is extremely important within the context of a creative endeavour. Evidence strongly suggests that we are more creative when free from interruption. Technological change invariably inspires new social structures.

Within the Creative Technologies programme, as part of continued efforts to foster as creative an environment as possible, we’re developing a set of working principles aimed at ring-fencing attention and minimising switches between tasks for both staff and students.

* Multi-tasking in this context refers to undertaking simultaneous tasks which are not automatic. For example, we can all walk and think at the same time; this is multi-tasking but walking is largely an automatic process i.e. we’ve done it so many times that it takes little or no additional processing.

by Greg

Digital is dead…long live digital

13:18 in Blog by Greg

Face it – The digital revolution is over

Nicholas Negroponte (1998)

Digital art, digital media, digital content, digital products…it goes on! Though few of us will have failed to notice that a great deal of our world is now transmitted and received as a string of 1s and 0s, in 2011, is the adjective ‘digital’ really relevant? Are we trying to suggest that by their very nature all things digital are the same, or somehow different from their respective compound-noun origins? Surely art is art, media is media, content is content and a product is still a product. The development of really great art/media/content/products involves stepping through a creative pipeline which we find to be remarkably consistent, irrespective of the tools used.

Convergence has forever fused media, data and personal IT to the extent that content is indistinguishable from the underlying technology, and the result is a transformative agent for both the mind and society. We shouldn’t be surprised by this, however. Circa 1964 McLuhan pointed out that ‘we shape our tools, and afterwards our tools shape us’, and Thoreau, 100 years earlier, suggested ‘we do not ride on the railroad; it rides upon us’. It’s important therefore that we maintain an objective distance* so as not to be consumed by our rapidly changing environment.

Let us now focus on the often-cited concept of the digital native. Marc Prensky’s 2001 Digital Natives, Digital Immigrants theory carries with it a number of dangers, particularly in an education context. In short, the proposal** is that those born before the rapid proliferation of digital technology are immigrants, and those growing up with the technology are natives. It’s clear that, in relation to the younger members of our population, access to information and communication technology (ICT) has reached an extraordinary level and their malleable brains are adapting to the multitude of interfaces and media layers with greater ease than, say, their parents’. However, we must not take from this that the native generation are fully literate in, or implicitly understand, the network. Remember that the primary online activity for our younger generation is social and/or recreational in nature. We find community silos (e.g. Facebook) in abundance, and the notion that the Google Generation understand and explore the web widely is quite simply false. The badge serves no purpose other than to unnecessarily distance a portion of the population from the emerging reality, and the very real concern is that by assuming too much we risk creating a significant skills gap in the next generation.

* The reality is that we find a conflict between objectivity and engagement. As artists and developers, if we’re not engaged we’re restricted in our ability to conjure the very best of ourselves and/or the project.

** Marc Prensky has since revised the theory in favour of digital wisdom – and yet the original iteration persists.

by Greg

21st Century Skills

20:40 in Blog by Greg

The intuitive mind is a sacred gift and the rational mind is a faithful servant. We have created a society that honours the servant and has forgotten the gift

Albert Einstein

We sit at the intersection between the art brain and the logical brain. One can make bold conceptual leaps with little reason other than instinct or emotion, while the other demands process, order and efficient execution. Fusing left-brain and right-brain processing is a task which is far from insignificant, and while the relative outputs uniquely complement each other, it is perhaps one of the most significant challenges facing us today.

As we undergo the consumerisation of IT and move towards ubiquitous computing, art and design are crucial components in humanising technology. Science fiction is now science fact, with Roddenberry’s hive mind not such a distant possibility (see video from Ericsson below). There are currently in the region of 5 billion mobile users worldwide, with 90% of the global population having access to a 3G network. Mobile internet will surpass the desktop within the next four years and, as the separation between work, home, formal and informal becomes truly blurred, content is king.

Where once the ability to read and write was the prerequisite to employment within a media context, an awareness of the audiovisual toolset is now the default condition. Discrete specialist facilities and tools are no more. Content developers are mobile and creativity spontaneous. Data wrangling, asset management and experience of multiple applications and workflows are fundamental. As we then move into the interactive space, building strong ideas around a software-driven core requires a unique skill set. Experience creators fuse ideas, design and technology. As Forrester’s Mike Gualtieri put it, great software talent means renaissance developers who have passion, creativity, discipline, domain knowledge and user empathy. We tend not to see how radical the changes of the last decade or so have been. In the USA, the Department of Labor estimates that 65 percent of 6-year-olds starting out in school will eventually work in careers that haven’t been invented yet. The labour market therefore is finding it difficult to keep up. Crucial for all of us is the willingness and ability to learn, unlearn and relearn.

In the midst of this technological whirlwind it is easy to forget that underpinning all of this is the human story. We are emotional beings bound by constants of motivation and traits of character which have been with us since day one. Communication and connecting with both hearts and minds (storytelling) has never been more important.

[Embedded video from Ericsson]

by Greg

Creative Enterprise

00:08 in Blog by Greg

This week we launched a new initiative – in partnership with Digital Derry and the Office of Innovation – which will enable our students to attain course credits for realising a digital business. Final-year students in both Creative Technologies (CT) and Design have the option of choosing the Innovation & Creative Enterprise (ICE) module and in so doing will be guided through all the necessary stages of developing a viable business, from conceptualisation and market research through to developing a prototype and pitching to business professionals and investors.

The initiative represents a multi-faceted and long-term commitment to the demystification of what it takes to ultimately become your own boss. Today’s digital talent represents tomorrow’s leaders (both in business and culture) and they’re increasingly much less interested in dedicating their creativity and time to slow-moving machines. After all, a business start-up is by definition a creative endeavour. Therefore, a strong focus is placed on creativity and design thinking (and doing) in parallel with skills in real-world networking, finance, legal matters and more traditional subject areas such as contemporary and digital culture.

Another key component involves a shift in thinking around the subject of failure. Most of us learn by doing, and making mistakes is a fundamental part of that process. Successful entrepreneurs will repeatedly speak of the importance of making mistakes early on and learning from them quickly. Having had the great fortune of visiting the west coast of the United States and the phenomenon that is Silicon Valley, I can report that the can-do attitude is infectious, and the mantra so often repeated is ‘let me fail fast’. This of course is not a wish for failure but an expression of the importance of gathering data (results) as quickly as possible so as to enable rapid improvement of the product or service.

We know that within the context of all things ‘digital’ the extraordinary pace of development and change means that successful ventures are most often born from a vibrant network of talent – designers, developers, producers – together with advisers and mentors. The support of Digital Derry in this respect is invaluable.

Start-up activity and entrepreneurship sit at the centre of a strong and healthy economy. I’m confident that our graduates, emboldened with the support of the network and a spirit of DIY, will go on to form many future and groundbreaking companies.

Allow me to finish in the words of a recently departed innovator, visionary and artist:

“Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.”

Steve Jobs (1955 – 2011)

by paul

Digital Dreams

13:08 in Blog, Digital Musings, Paul by paul

At a seminar I attended on Friday last, a presentation was given by two 15-year-olds about their digital day. I realise that they were designing an input for a particular audience, but what I found interesting was the notion that interfacing with the so-called digital space is something that only happens in our waking hours. When I sleep I do not switch off my machines – become in some way non-interactive – and I expect the ‘stuff’ to keep working even if I am not there to mediate it. If I cannot sleep, or am awakened from sleep for whatever reason, the first thing I will do is search out the machines to check what or who I have been neglecting.

This, I am sure, is not extraordinary. What is extraordinary is our constant need to see this as an exchange with the digital world, as though that space were something not now intrinsic to our existence – with apologies to Descartes, a kind of ‘iPhone, therefore I am’. Hence my conviction, held now for many months, that we need to stop using the term ‘digital’. It has become meaningless, except as a crutch for those who are too removed or too lazy to figure out that it is the analogue which has become rare enough to be highlighted as being in play at any given time. Digital is quite simply where we expect our lives to be lived.

What I am also convinced about is that the bulk of our existence in this space will be lived on the mobile phone, in whatever form it takes. Anyone who refers to their phone as a phone has obviously not yet entered the smartphone universe. Those who have will know that their phone is not a phone but a small object of desire (I will be charitable and include Android systems in this equation) linking them seamlessly to the information and contacts without which daily life would be less palatable, if not insufferable.

It amuses me, then, to hear Wired magazine heralding the end of the Internet. If anything it is the beginning of the new internet, or the new phase of the internet where the new levels of sophistication in the manipulation of data will make the basic search engine seem as antiquated as Stephenson’s Rocket. But it will also necessitate the building of teams capable of creating and harvesting that data, teams which will include programmers, developers, designers, content creatives and policy wonks. Now I wonder what kind of course is going to produce that heady mix?