Computational thinking’s influence on research and education for all

ABSTRACT Computer science has produced, at an astonishing and breathtaking pace, amazing
technology that has transformed our lives with profound economic and societal impact. In the course
of the past ten years, we have come to realize that computer science offers not just useful software and
hardware artifacts, but also an intellectual framework for thinking, what I call “computational thinking”.
Everyone can benefit from thinking computationally. My grand vision is that computational thinking will
be a fundamental skill—just like reading, writing, and arithmetic—used by everyone by the middle of the
21st Century.

KEYWORDS Computational Thinking, Education, Curriculum.

  1. WHAT IS COMPUTATIONAL THINKING?
    This article describes how pervasive computational thinking has become in research and education.
    Researchers and professionals in an increasing number of fields beyond computer science have been reaping benefits from computational thinking. Educators in colleges and universities have changed undergraduate curricula to promote computational thinking to all students, not just computer science majors. Before elaborating on this progress toward my vision, let’s begin by describing what is meant by computational thinking.

1.1. Definition
I use the term “computational thinking” as shorthand for “thinking like a computer scientist” (Wing, 2006).
To be more descriptive, however, I now define computational thinking (with input from Al Aho at Columbia University, Jan Cuny at the National Science Foundation, and Larry Snyder at the University of Washington)
as follows:
Computational thinking is the thought processes involved in formulating a problem and
expressing its solution(s) in such a way that a computer—human or machine—can effectively
carry out.
Informally, computational thinking describes the mental activity in formulating a problem to admit a
computational solution. The solution can be carried out by a human or machine. This latter point is important.
First, humans compute. Second, people can learn computational thinking without a machine. Third, “computers” of today combine the intelligence of humans and the intelligence of machines. Also, computational thinking is not just about problem solving, but also about problem formulation.
In this definition I deliberately use technical terms. By “expressing” I mean creating a linguistic representation for the purpose of communicating a solution to others, people or machines. The expressiveness of a language, e.g., a programming language, can often make the difference between an elegant and an inelegant solution, e.g., between a program provably free of certain classes of bugs and one that is not. By “effective,” in the context of the Turing machine model of computation, I mean “computable” (or “decidable” or “recursive”); however, it is an open research question to revisit models of computation, and thus the meaning of “effective,” when we consider what is computable by, say, biological or quantum computers (Wing, 2008) or what is solvable by humans (Levin, 2013; Wing, 2008).
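
As a concrete illustration of this definition, consider the everyday problem of finding a name in a sorted phone book. Formulating the problem computationally (the names are ordered, so each comparison can eliminate half of the remaining candidates) and expressing the solution precisely yields binary search. The sketch below is illustrative; the function and data names are my own, not drawn from any particular source.

```python
# A hypothetical illustration of the definition above: we formulate the
# problem "find a name in a sorted phone book" so that its solution can be
# effectively carried out by a computer, human or machine.

def find_name(sorted_names, target):
    """Binary search over a sorted list; returns the index of target, or -1."""
    lo, hi = 0, len(sorted_names) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_names[mid] == target:
            return mid                    # found: report its position
        elif sorted_names[mid] < target:
            lo = mid + 1                  # target must lie in the upper half
        else:
            hi = mid - 1                  # target must lie in the lower half
    return -1                             # target is not present

names = ["Aho", "Cuny", "Snyder", "Turing", "Wing"]
print(find_name(names, "Turing"))         # prints 3
```

A person can execute exactly these steps by hand on a paper phone book, which is the sense in which the “computer” carrying out the solution may be human or machine.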

1.2. Abstraction is Key
Computer science is the automation of abstractions (Aho & Ullman, 1992) [1]. So, the most important and
high-level thought process in computational thinking is the abstraction process. Abstraction is used in defining patterns, generalizing from specific instances, and parameterization. It is used to let one object stand for many.
It is used to capture essential properties common to a set of objects while hiding irrelevant distinctions among them. For example, an algorithm is an abstraction of a process that takes inputs, executes a sequence of steps, and produces outputs to satisfy a desired goal. An abstract data type defines an abstract set of values and operations for manipulating those values, hiding the actual representation of the values from the user of the abstract data type. Designing efficient algorithms inherently involves designing abstract data types.
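
To make the abstract-data-type idea concrete, here is a minimal sketch of a stack ADT. The class and method names, and the list-based representation, are illustrative assumptions on my part; clients see only the abstract operations, while the representation stays hidden and could be replaced without changing client code.

```python
# A minimal sketch of a stack as an abstract data type: an abstract set of
# values and operations (push, pop, is_empty) whose actual representation
# is hidden from the user of the type.

class Stack:
    """Abstract view: a last-in, first-out collection of values."""

    def __init__(self):
        self._items = []              # hidden representation detail

    def push(self, value):
        self._items.append(value)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def is_empty(self):
        return not self._items

# Client code depends only on the abstract operations, never on the list.
s = Stack()
s.push(1)
s.push(2)
print(s.pop())    # 2
print(s.pop())    # 1
```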
Abstraction gives us the power to scale and deal with complexity. Applying abstraction recursively allows
us to build larger and larger systems, with the base case (at least for traditional computer science) being
bits (0’s and 1’s). In computing, we routinely build systems in terms of layers of abstraction, allowing us
to focus on one layer at a time and on the formal relations (e.g., “uses,” “refines” or “implements,”
“simulates”) between adjacent layers. When we write a program in a high-level language, we are building on lower layers of abstractions. We do not worry about the details of the underlying hardware, the operating system, the file system, or the network; furthermore, we rely on the compiler to correctly implement the semantics of the language. The narrow-waist architecture of the Internet demonstrates the effectiveness and robustness of appropriately designed abstractions: the simple TCP/IP layer at the middle has enabled a multitude of unforeseen applications to proliferate at layers above, and a multitude of unforeseen platforms, communications media, and devices to proliferate at layers below.

[1] In this book, the authors define Computer Science to be “The Mechanization of Abstraction”.

  2. COMPUTATIONAL THINKING AND OTHER DISCIPLINES
    Computational thinking has already influenced the research agenda of all science and engineering disciplines.
    Starting decades ago with the use of computational modeling and simulation, through today’s use
    of data mining and machine learning to analyze massive amounts of data, computation is recognized as the third pillar of science, along with theory and experimentation (President’s Information Technology Advisory Committee, 2005).
    Consider just biology. The expedited sequencing of the human genome through the “shotgun algorithm” awakened the interest of the biology community in computational concepts (e.g., algorithms and data structures) and computational approaches (e.g., massive parallelism for high throughput), not just computational artifacts (e.g., computers and networks). In 2005, the Computer Science and Telecommunications Board of the National Research Council (NRC) published a 468-page report laying out a research agenda to explore the interface between biology and computing (National Research Council, 2005). In 2009, the NRC Life Sciences Board’s study on Biology in the 21st Century recommends that “within the national New Biology Initiative, priority be given to the development of the information technologies and sciences that will be critical to the success of the New Biology” (National Research Council, 2009). Now at many colleges students can choose to major in computational biology.
    The volume and rate at which scientists and engineers are now collecting and producing data—through instruments, experiments, simulations, and crowd-sourcing—are demanding advances in data analytics, data storage and retrieval, as well as data visualization. The complexity of the multi-dimensional systems that scientists and engineers want to model and analyze requires new computational abstractions. These are just two reasons that every scientific directorate and office at the National Science Foundation participated in the Cyber-enabled Discovery and Innovation [2], or CDI, program, an initiative started when I first joined NSF in 2007. CDI was in a nutshell “computational thinking for science and engineering.”
    Computational thinking has also begun to influence disciplines and professions beyond science and
    engineering. For example, areas of active study include algorithmic medicine, computational economics, computational finance, computational law, computational social science, digital archaeology, digital arts, digital humanities, and digital journalism. Data analytics is used in training Army recruits, detecting email spam and credit card fraud, recommending movies and books, ranking the quality of services, and personalizing coupons at supermarket checkouts. Machine learning is used by every major IT company to understand human behavior and thus tailor a customer’s experience to his or her own preferences.
    Every industry and profession talks about Big Data and Cloud Computing. The combination of machine learning, especially deep learning, and large-scale compute infrastructure, built from CPUs, GPUs, and FPGAs, underlies the current resurgence of artificial intelligence, with no end in sight. Cognitive tasks, such as image classification and speech recognition, performed computationally now reach human-level performance. Every industry—automotive, finance, healthcare, journalism, law, manufacturing—is being disrupted by technological advances in computer science. People working in those industries are going to have to think computationally.
  3. COMPUTATIONAL THINKING AND EDUCATION
    In the early 2000s, computer science had a moment of panic. Undergraduate enrollments were dropping.
    Computer science departments stopped hiring new faculty. One reason I wrote my 2006 CACM article on computational thinking was to inject some positive thinking into our community. Rather than bemoan the

[2] http://www.nsf.gov/crssprgm/cdi/


declining interest in computer science, I wanted us to shout to the world about the joy of computing, and
more importantly, about the importance of computing. Sure enough, today enrollments are skyrocketing
(again). Demand for graduates with computing skills far exceeds the supply; six-figure starting salaries
offered to graduates with a B.S. in Computer Science are not uncommon.

3.1. Undergraduate Education
Campuses throughout the United States and abroad are revisiting their undergraduate curriculum in
computer science. They are changing their first course in computer science to cover fundamental principles and concepts, not just programming. For example, Carnegie Mellon revised its undergraduate first-year courses to promote computational thinking for non-majors (Bryant, Sutner, & Stehlik, 2010). Harvey Mudd redesigned its introductory course with stellar success, including increasing the participation of women in computing (Klawe, 2013). At Harvard, «In just a few short years CS50 has rocketed from being a middling course to one of the biggest on campus, with nearly 700 students and an astounding 102-member staff» (Farrell, 2013). Harvard’s CS50 is now Yale’s most popular class (Annear, 2015). For MIT’s introductory course to computer science, Eric Grimson and John Guttag say in their opening remarks «I want to begin talking about the concepts and tools of computational thinking, which is what we’re primarily going to focus on here. We’re going to try and help you learn how to think like a computer scientist» (Grimson & Guttag, 2008).
Many such introductory courses are now offered to, or required of, non-majors. Depending on the school, the requirement might be a general requirement (CMU) or a distribution requirement, e.g., satisfying a science and technology (MIT), an empirical and mathematical reasoning (Harvard), or a quantitative reasoning (Princeton) requirement.

3.2. What about K-12?
Not until computational thinking is taught routinely at the K-12 levels of education will my vision be fully
realized. As a community, we have made faster progress at spreading computational thinking
to K-12 than I had expected. We have professional organizations, industry, non-profits, and government
policymakers to thank.
The College Board, with support from NSF, designed a new Advanced Placement (AP) course that covers
the fundamental concepts of computing and computational thinking. This new course, called Computer
Science Principles [3], launched in Fall 2016. Not coincidentally, the changes to the Computer Science AP
course are consistent with the changes in introductory computer science courses taking place now on
college campuses.
Industry is also promoting the importance of computing for all. Since 2006, with help from Google and
later Microsoft, Carnegie Mellon has held summer workshops for high school teachers called “CS4HS.”
These workshops are designed to deliver the message that there is more to computer science than computer programming. CS4HS spread in 2007 to UCLA and the University of Washington. Since 2009, under the auspices of Google, CS4HS [4] has spread to over 40,000 teachers in over 40 countries, reaching over one million students.
Launched in 2013, code.org [5] is a non-profit organization dedicated to making computer science accessible to all students. It is supported by donations from companies, foundations, and individuals. Many companies

[3] https://advancesinap.collegeboard.org/stem/computer-science-principles
[4] https://www.cs4hs.com/
[5] https://code.org/


support code.org out of their own need for more professionals trained with computer science skills. Code.org hosts a rich suite of educational materials and tools that run on many platforms, including smart phones and tablets. It lists local high schools and camps throughout the US where students can learn computing.
Computer science has also gotten attention from government leaders. President Barack Obama, in his
2016 State of the Union address, advocated for the nation to provide «pre-K for all, offering every student
the hands-on computer science and math classes that make them job-ready on day one». Included in the
Computer Science for All Initiative [6] he announced on January 30, 2016, is $120 million from the National
Science Foundation. This money will be used to train as many as 9,000 more high school teachers to teach
computer science and integrate computational thinking into their curriculum.
Computational thinking has also spread internationally. In January 2012, the British Royal Society published a report that says that «‘Computational thinking’ offers insightful ways to view how information operates in many natural and engineered systems» and recommends that «Every child should have the opportunity to learn Computing at school» (“School” in the UK is the same as K-12 in the US.) Since that report, the UK Department for Education changed its national curriculum to make the study of computing mandatory for all K-12 students in the UK starting in Fall 2014 (UK Department for Education, 2013). Much of the legwork behind this achievement was accomplished by the grassroots effort called “Computing at School” [7].
More recently, the Danish Growth Council published a 120-page report (The Danish Growth Council,
2016) with recommendations to the Danish government that include making computational thinking a
mandatory component in education at all levels, from primary school through adult education. The report
recognizes that «the proper digital and analytical level of knowledge [be] secured among teachers throughout the educational system» and that «computational thinking must be part of the educational food chain».
Other countries are also making rapid strides in the same direction. The Australian National Curriculum
includes a Digital Technologies subject for K-10 students in which they «use computational thinking and
information systems to define, design and implement digital solutions». China’s Ministry of Education is
discussing plans for making computational thinking a required core competency for high school graduation.
I am aware of similar efforts in Japan and Korea.

  4. PROGRESS SO FAR AND WORK STILL TO DO
    Ten years after the publication of my CACM Viewpoint, how far have we come? We have come a long
    way, along all dimensions: computational thinking has influenced the thinking in many other disciplines and many professional sectors; computational thinking, through revamped introductory computer science courses, has changed undergraduate curricula. We are making inroads in K-12 education worldwide.
    While we have made incredible progress, our journey has just begun. We will see more and more disciplines make scholarly advances through the use of computing. We will see more and more professions transformed by their reliance on computing for conducting business. We will see more and more colleges and universities requiring an introductory computer science course for graduation. We will see more and more countries adding computer science to K-12 curricula.
    Practical challenges and research opportunities remain. The main practical challenge is that we do not have enough K-12 teachers trained to teach computer science to K-12 students. I am optimistic that, over time,

[6] https://www.whitehouse.gov/the-press-office/2016/01/30/fact-sheet-president-obama-announces-computer-science-all-initiative-0
[7] http://www.computingatschool.org.uk/


we will solve this problem.
There also are interesting research questions that I would encourage computer scientists to pursue, working with the cognitive and learning sciences communities. First, what computer science concepts should be taught when, and how?
Consider an analogy to mathematics. We teach numbers to 5-year-olds, algebra to 12-year-olds and calculus to 18-year-olds. We have somehow figured out the progression of concepts to teach in mathematics, where learning one new concept builds on understanding the previous concept, and where the progression reflects the progression of mathematical sophistication of a child as he or she matures.
What is that progression in computer science? For example, when is it best to teach recursion? Children
learn to solve the Towers of Hanoi puzzle (for small n), and in history class we teach “divide and conquer” as a strategy for winning battles. But is the general concept better taught in high school? We teach long division to 9-year-olds in 4th grade, but we never utter the word “algorithm.” And yet, the way it is taught, long division is just an algorithm. Is teaching the general concept of an algorithm too soon for a 4th grader?
More deeply, are there concepts in computing that are innate and do not need to be formally learned?
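
To ground the recursion example, here is the classic Towers of Hanoi solution written as a program; a minimal sketch, with peg names chosen purely for illustration. Its three-line recursive structure is exactly the “divide and conquer” strategy named above, and, like long division, it is an algorithm whether or not we utter the word.

```python
# A minimal sketch of the recursive Towers of Hanoi solution: to move n
# disks, set the top n-1 aside, move the largest disk, then bring the n-1
# disks back on top of it.

def hanoi(n, source, spare, target):
    """Print the moves that transfer n disks from source to target."""
    if n == 0:
        return                                   # base case: nothing to move
    hanoi(n - 1, source, target, spare)          # clear the top n-1 disks aside
    print(f"move disk {n} from {source} to {target}")
    hanoi(n - 1, spare, source, target)          # stack the n-1 disks back on top

hanoi(3, "A", "B", "C")    # 2**3 - 1 = 7 moves for three disks
```
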
Second, we need to understand how best to use computing technology in the classroom. Throwing computers into the classroom is not the most effective way to teach computer science concepts. How can we use technology to enhance learning and reinforce the understanding of computer science concepts? How can we use technology to measure progress, learning outcomes, and retention over time? How can we use technology to personalize learning for individual learners, as each of us learns at a different pace and has different cognitive abilities?
    We have made tremendous progress in injecting computational thinking into research and education across all fields in the last ten years. We still have a way to go, but fortunately, academia, industry, and government forces are aligned toward realizing the vision of making computational thinking commonplace.