Computer Science

subject overview

A high-quality computing education equips learners to use computational thinking and creativity to understand and change the world. Computing has deep links with mathematics, science and design and technology, and provides insights into both natural and artificial systems. The core of computing is computer science, in which learners are taught the principles of information and computation, how digital systems work and how to put this knowledge to use through programming. Building on this knowledge and understanding, learners are equipped to use information technology to create programs, systems and a range of content.

unit overview - autumn term unit 1.1

The characteristics of contemporary processors, input, output and storage devices

Skills

Be able to describe the structure of the central processing unit and the role of its components and registers

Be able to code elementary assembly language programs

Be able to describe the process of program execution (i.e. the fetch-decode-execute cycle) and its effect on registers

Be able to explain how the component features of a CPU (multiple cores, cache, clock speed) affect its performance

Be able to explain the features of CISC and RISC processors and how each one is appropriate for different scenarios

Be able to explain how multicore and parallel systems are appropriate for different technological scenarios

Be able to justify how a range of input, output and storage devices are utilised to solve a range of technological problems.

Be able to explain how different storage technologies (magnetic, flash and optical) are utilised to solve a range of technological problems.

Be able to explain the uses of RAM, ROM and virtual storage in computer systems.

Knowledge

Understand the structure and function of the processor:

  • The Arithmetic and Logic Unit (ALU), Control Unit and registers (Program Counter (PC), Accumulator (ACC), Memory Address Register (MAR), Memory Data Register (MDR), Current Instruction Register (CIR)). Buses (data, address and control) and how these relate to assembly language programs
  • The fetch-decode-execute cycle, including its effect on registers
  • The factors affecting the performance of the CPU: clock speed, number of cores, cache
  • The use of pipelining in a processor to improve efficiency
  • Von Neumann, Harvard and contemporary processor architecture

Understand types of processor:

  • The differences between, and uses of, CISC and RISC processors
  • GPUs and their uses (including those not related to graphics)
  • Multicore and parallel systems

Understand input, output and storage:

  • How different input, output and storage devices can be applied to the solution of different problems
  • The uses of magnetic, flash and optical storage devices
  • RAM and ROM
  • Virtual storage

 

Rationale

Modern computer systems have, at their heart, a central processing unit (CPU), which carries out the instructions of computer programs. In the space of one second, it is capable of executing billions of mathematical operations and logical evaluations. These processors are feats of modern engineering, and they contribute more than any other component to a computer's performance.

As processor technology develops, it is increasingly common to find processors embedded in a wide range of everyday devices. Cars, washing machines, cameras, watches, televisions and refrigerators have all been enhanced with processors; so have running shoes, door handles, smoke alarms, kettles and dog collars.

Although processor architecture has been refined over time, each iteration is built from the same key components. In Comp 1.1, you will learn how two different processor architectures fetch instructions from RAM, then decode and execute them. In addition, you will learn how pipelining can make this process more efficient.
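To make the fetch-decode-execute cycle concrete, here is a minimal Python sketch of a highly simplified processor loop. The toy instruction set (LOAD, ADD, STORE, HALT) is invented for illustration, loosely modelled on LMC-style mnemonics rather than any real CPU:

```python
# A toy fetch-decode-execute loop. The instruction set is invented for
# illustration; real CPUs are vastly more complex.
memory = [
    ("LOAD", 4),   # ACC <- memory[4]
    ("ADD", 5),    # ACC <- ACC + memory[5]
    ("STORE", 6),  # memory[6] <- ACC
    ("HALT", None),
    7, 35, 0,      # data: two operands and a result cell
]

pc, acc = 0, 0             # Program Counter and Accumulator
while True:
    cir = memory[pc]       # FETCH: copy the instruction at PC into the CIR
    pc += 1                # increment PC ready for the next fetch
    opcode, operand = cir  # DECODE: split into opcode and operand
    if opcode == "HALT":   # EXECUTE: act on the opcode
        break
    elif opcode == "LOAD":
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc

print(memory[6])  # 42
```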

unit overview - autumn term unit 1.2

Software and software development

Skills

Describe the purpose and function of operating systems

Explain the processes through which an operating system manages memory

Describe the process through which an operating system handles a system interrupt

Describe the various methods which an operating system uses to schedule tasks

Classify the types of operating system

Describe the role of the BIOS, device drivers and virtual machines in a computer system

Justify the selection of different types of applications and source code to solve technological problems.

Describe the process of translating low- and high-level programs into machine code for execution by a CPU.

Be able to apply a range of common programming techniques in Python to solve common programming problems.

Be able to apply assembly language mnemonics in a range of simple LMC programs to solve basic mathematical problems.

Knowledge

Understand operating systems:

  • The need for, function and purpose of operating systems
  • Memory Management (paging, segmentation and virtual memory)
  • Interrupts: the role of interrupts and Interrupt Service Routines (ISR), and their role within the Fetch-Decode-Execute Cycle
  • Scheduling: round robin, first come first served, multilevel feedback queues, shortest job first and shortest remaining time
  • Distributed, embedded, multitasking, multi-user and real time operating systems
  • BIOS
  • Device drivers
  • Virtual machines: any instance where software is used to take on the function of a machine, including executing intermediate code or running an operating system within another

Understand applications generation:

  • The nature of applications, justifying suitable applications for a specific purpose
  • Utilities
  • Open source vs closed source
  • Translators: Interpreters, compilers and assemblers
  • Stages of compilation (lexical analysis, syntax analysis, code generation and optimisation)
  • Linkers and loaders and use of libraries

Understand high- and low-level programming:

  • Procedural programming language techniques (program flow, variables and constants, procedures and functions, arithmetic, Boolean and assignment operators, string handling, file handling)
  • Assembly language (including following and writing simple programs with Little Man Computer)
  • Modes of addressing memory (immediate, direct, indirect and indexed)
  • Object-oriented languages with an understanding of classes, objects, methods, attributes, inheritance, encapsulation and polymorphism (see the sketch after this list)
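As a reference point for those object-oriented terms, here is a minimal Python sketch; the Shape, Square and Circle classes are invented for illustration:

```python
class Shape:                          # a class bundles attributes and methods
    def __init__(self, name):
        self._name = name             # underscore prefix: encapsulation by convention

    def area(self):                   # to be overridden by subclasses
        raise NotImplementedError

    def describe(self):
        return f"{self._name}: area {self.area()}"

class Square(Shape):                  # inheritance: a Square is a Shape
    def __init__(self, side):
        super().__init__("square")
        self._side = side

    def area(self):                   # polymorphism: same call, class-specific result
        return self._side ** 2

class Circle(Shape):
    def __init__(self, radius):
        super().__init__("circle")
        self._radius = radius

    def area(self):
        return 3.14159 * self._radius ** 2

for shape in (Square(4), Circle(1)):  # objects are instances of classes
    print(shape.describe())           # square: area 16 / circle: area 3.14159
```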

Rationale

Software can be classified according to the various functions it performs for a computer system. This unit unpacks how systems software runs ‘under the hood’, while applications and utilities are run by users to perform tasks. There is an impressive division of labour in computer software: programmers are able simply to trust that the files they create will be saved into the correct folders, that keyboard and mouse input will be handled appropriately, and that their applications will run at appropriate speeds.

This unit considers each of these processes, and pays close attention to how operating systems schedule tasks, how translators turn programs into machine code instructions, and how different generations of programmers have written instructions for their respective CPUs.
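As a taste of the scheduling topic, here is a minimal Python sketch of round-robin scheduling, one of the algorithms in the knowledge list above. The job names and burst times are invented for illustration:

```python
from collections import deque

# Round robin: each job gets a fixed time slice (quantum) in turn
# until it completes.
def round_robin(jobs, quantum=2):
    queue = deque(jobs.items())              # (name, remaining_time) pairs
    while queue:
        name, remaining = queue.popleft()
        slice_used = min(quantum, remaining)
        print(f"run {name} for {slice_used} ticks")
        remaining -= slice_used
        if remaining > 0:
            queue.append((name, remaining))  # rejoin the back of the queue

round_robin({"editor": 3, "compiler": 5, "player": 1})
```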

unit overview - autumn term unit 1.3

Exchanging data

Skills

Be able to design and model database structures to solve a range of data requirements.

Be able to classify networks according to their hardware and connectivity requirements.

Be able to describe how data travels from device to device on a network with reference to protocols and layers.

Be able to code simple web pages using HTML, CSS and JavaScript.

Be able to justify decisions about when and how to utilise compression algorithms.

Knowledge

Understand compression, encryption and hashing:

  • Lossy vs Lossless compression
  • Run length encoding and dictionary coding for lossless compression
  • Symmetric and asymmetric encryption
  • Different uses of hashing

Understand databases:

  • Relational database, flat file, primary key, foreign key, secondary key, entity relationship modelling
  • Methods of capturing, selecting, managing and exchanging data
  • Normalisation to 3NF
  • SQL: interpret and modify (see the sqlite3 sketch after this list)
  • Referential integrity
  • Transaction processing, ACID (Atomicity, Consistency, Isolation, Durability), record locking and redundancy
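To show what interpreting and modifying SQL looks like in practice, here is a minimal sketch using Python's built-in sqlite3 module; the Student table and its rows are invented for illustration:

```python
import sqlite3

# An in-memory database keeps the example self-contained.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Student (id INTEGER PRIMARY KEY, name TEXT, house TEXT)")
con.executemany("INSERT INTO Student VALUES (?, ?, ?)",
                [(1, "Ada", "Turing"), (2, "Alan", "Lovelace")])

# SELECT with a WHERE clause filters rows; UPDATE modifies them in place.
for row in con.execute("SELECT name FROM Student WHERE house = 'Turing'"):
    print(row)                                # ('Ada',)
con.execute("UPDATE Student SET house = 'Turing' WHERE id = 2")
```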

Understand networks:

  • Characteristics of networks and the importance of protocols and standards
  • Internet structure (The TCP/IP stack, DNS, Protocol layering, LANs and WANs, Packet and circuit switching)
  • Network security and threats, use of firewalls, proxies and encryption
  • Network hardware
  • Client-server and peer-to-peer

Understand web technologies:

  • HTML, CSS & JavaScript
  • Search engine indexing
  • PageRank algorithm
  • Server-side and client-side processing

Rationale

The modern computer system is connected to a local (and probably global) network, via wired or wireless transmission media. This unit considers how the various devices on these networks interconnect and communicate with one another.

In particular, we engage with the notion of communication protocols – the rules for exchanging data on networks.  During this unit, we examine how data passes reliably from one device to another over enormous distances, almost instantaneously, by observing the various protocols at each layer.

We also consider how large volumes of data are quickly transmitted over the Internet after first being compressed. Thanks to some useful compression algorithms, we are now able to stream high quality sports games live, enjoy live video with many participants on our tablet devices and download images from the cloud onto our smartphones. This unit considers some methods of compressing data so this can be made possible.
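As a first example, here is a minimal sketch of run-length encoding, one of the lossless methods named in the knowledge list above. The sample string is invented for illustration:

```python
# Run-length encoding replaces each run of repeated symbols with one
# (count, symbol) pair. It works best on data with long runs, such as
# simple bitmap images.
def rle_encode(data):
    encoded, i = [], 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i]:
            run += 1
        encoded.append((run, data[i]))
        i += run
    return encoded

def rle_decode(pairs):
    return "".join(symbol * count for count, symbol in pairs)

packed = rle_encode("WWWWWBBBW")
print(packed)                        # [(5, 'W'), (3, 'B'), (1, 'W')]
assert rle_decode(packed) == "WWWWWBBBW"
```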

unit overview - autumn term unit 2.1

Elements of computational thinking

Skills

Being able to model problems by hiding away complexity and focusing on the most important features.

Being able to plan software by expressing the inputs, outputs and preconditions.

Being able to design solutions using top-down modular design.

Being able to design algorithms that apply logic in appropriate sequence, considering how decision points and loops enable branching and iteration.

Being able to identify which parts of complex systems occur simultaneously.

Knowledge

  • Understanding thinking abstractly (the nature of abstraction, the need for abstraction, the differences between an abstraction and reality, devise an abstract model for a variety of situations)
  • Understanding thinking ahead (identify the inputs and outputs for a given situation; determine the preconditions for devising a solution to a problem; the nature, benefits and drawbacks of caching (a short caching sketch follows this list); the need for reusable program components)
  • Understanding thinking procedurally (Identify the components of a problem; identify the components of a solution to a problem; determine the order of the steps needed to solve a problem; identify subprocedures necessary to solve a problem)
  • Understanding thinking logically (identify the points in a solution where a decision has to be taken; determine the logical conditions that affect the outcome of a decision; determine how decisions affect flow through a program)
  • Understanding thinking concurrently (determine the parts of a problem that can be tackled at the same time; outline the benefits and tradeoffs that might result from concurrent processing in a particular situation)
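To illustrate the benefits and drawbacks of caching mentioned above, here is a minimal Python sketch using the standard library's lru_cache; the Fibonacci function is just a convenient example:

```python
from functools import lru_cache

# lru_cache stores previous results, so repeated calls with the same
# argument are answered instantly. The benefit is speed; the drawback
# is the memory spent holding cached values.
@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))  # fast: each subproblem is computed once, then cached
```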

Rationale

This unit considers some frameworks for thinking about computational problems, so that computer scientists might share a common approach to complexity.

Sometimes, a solution to one problem can be used to solve a wider set of problems. A good computer scientist will seek to adapt their code for wider application, separating the concrete uses of their code from the algorithms it contains. This thinking skill is known as abstraction.

A good computer scientist also develops the ability to think ahead, so that they attempt to form a picture of their final solution. They consider how their system will work with existing technologies, how it will be utilised by its users, and what it will take to create it.

Practising computer science also involves thinking about the steps that a solution will involve. No matter what the problem, its solution will involve bringing together instructions, decisions, loops and logic. Being able to arrange these constructs and logic – thinking procedurally and logically – is a cornerstone of computational thinking.

To consider multicore processors, big data, computer gaming, operating systems and weather modelling is to see that multiple processes are executed simultaneously. Recognising that parts of problems can be solved at the same time involves thinking concurrently.

These thinking skills are refined through considerable experience of working with software. Each skill will resonate with anybody who has ever programmed a computer and attempted to write software.

unit overview - autumn term unit 2.2

Problem solving and programming

Skills

Being able to apply the full range of A Level programming techniques to write computer programs that solve real world problems.

Writing programs which:

  • Apply the three main programming constructs – sequence, selection & iteration – to solve common computing problems
  • Use advanced recursion techniques
  • Use subprograms appropriately (with and without parameters and return values)
  • Are developed in a range of IDEs
  • Apply OOP techniques

Be able to apply higher-level computational thinking to solve problems:

  • Mapping out larger problems in terms of their subcomponents
  • Applying data structures and iteration to divide and conquer problems

Knowledge

Programming techniques:

  • Programming constructs: sequence, iteration, branching
  • Recursion: how it can be used and how it compares to an iterative approach (see the sketch after this list)
  • Global and local variables
  • Modularity, functions and procedures, parameter passing by value and by reference
  • Use of an IDE to develop/debug a program
  • Use of object oriented techniques
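To ground the recursion item above, here is a minimal sketch of the same computation written both recursively and iteratively; factorial is just a convenient example:

```python
# Recursion expresses factorial in terms of a smaller version of itself;
# iteration uses a loop and a running total. The iterative version avoids
# the overhead of a growing call stack.
def factorial_recursive(n):
    if n == 0:                      # base case stops the recursion
        return 1
    return n * factorial_recursive(n - 1)

def factorial_iterative(n):
    total = 1
    for i in range(2, n + 1):
        total *= i
    return total

assert factorial_recursive(6) == factorial_iterative(6) == 720
```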

Computational methods:

  • Features that make a problem solvable by computational methods
  • Problem recognition
  • Problem decomposition
  • Use of divide and conquer
  • Use of abstraction
  • Learners should apply their knowledge of: backtracking, data mining, heuristics, performance modelling, pipelining, visualisation to solve problems

Rationale

Programming is the thinking process of creating a sequence of instructions for a computer to execute. Once we start programming, we quickly notice just how many high-level languages there are at our disposal. On this course, we prioritise one, Python, because it enables us to dive deep into all the complexities of the A Level and emerge knowing how to create useful software. Broadening out to study further languages can wait for university; this course prioritises depth over breadth.

During the A Level, we work through sequence, selection and iteration to create the best software we can. Soon, we scale up our thinking to use sub-programs to make modular, maintainable programs; and scale it up again to write programs applying the object-oriented paradigm. By the end of the course, you will have created a fully working computer game with an engaging interface, and possess a set of skills to use in any professional domain.

As we make this progress, we grapple with greater problems along the way. Here, we learn how to view problems like computer scientists, applying familiar techniques of computational thinking so that our software development is deliberate and well reasoned at every stage.

unit overview - spring term unit 1.4

Data types, data structures and algorithms

Skills

Be able to program in a high-level language using the range of data types.

Be able to represent and manipulate numeric data in binary form (positive, negative, integer and floating point).

Be able to represent and manipulate numeric data in hexadecimal form.

Be able to perform bitwise manipulation using masks and shifts with AND, OR and XOR.

Be able to program in a high-level language using the range of mentioned data structures (see below).

Be able to create each of the mentioned data structures (see below), as well as traverse, add data to and remove data from each.

Be able to create Boolean logic diagrams (including D-type flip-flops, half and full adders)

Be able to express complex logic problems in simplified Boolean algebra using Karnaugh maps

Be able to use De Morgan’s Laws, distribution, association, commutation and double negation in Boolean algebra

Knowledge

Understand data types:

  • Primitive data types: integer, real/floating point, character, string and Boolean
  • Represent positive integers in binary
  • Use of sign and magnitude and two’s complement to represent negative numbers in binary
  • Addition and subtraction of binary integers
  • Represent positive integers in hexadecimal
  • Convert positive integers between binary, hexadecimal and denary
  • Positive and negative real numbers using normalised floating point representation
  • Representation and normalisation of floating point numbers in binary
  • Floating point arithmetic, positive and negative numbers, addition and subtraction
  • Bitwise manipulation and masks: shifts, combining with AND, OR and XOR (a short sketch follows this list)
  • How character sets (ASCII and UNICODE) are used to represent text
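To make the bitwise and two's complement items concrete, here is a minimal Python sketch; the 8-bit width and flag values are chosen purely for illustration:

```python
# An 8-bit two's complement value is negated by inverting the bits and
# adding one; masks isolate, set or toggle individual bits.
def twos_complement_8bit(n):
    return (~n + 1) & 0xFF           # invert, add one, keep 8 bits

print(bin(twos_complement_8bit(5)))  # 0b11111011 represents -5 in 8 bits

flags = 0b0110
print(bin(flags & 0b0100))           # AND mask tests a bit        -> 0b100
print(bin(flags | 0b0001))           # OR mask sets a bit          -> 0b111
print(bin(flags ^ 0b0110))           # XOR mask toggles bits       -> 0b0
print(bin(flags << 1))               # left shift doubles the value -> 0b1100
```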

Understand data structures:

  • Arrays (of up to 3 dimensions), records, lists, tuples
  • The following structures to store data: linked list, graph (directed and undirected), stack, queue, tree, binary search tree, hash table
  • How to create, traverse, add data to and remove data from the data structures mentioned above (a minimal stack sketch follows this list)
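As one example of those operations, here is a minimal stack sketch in Python; the class is written from scratch for illustration, though a plain list would do the same job:

```python
# A stack adds and removes items at the same end (LIFO). A queue, by
# contrast, removes items from the opposite end to which it adds (FIFO).
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):            # add to the top
        self._items.append(item)

    def pop(self):                   # remove from the top
        return self._items.pop()

    def peek(self):                  # inspect the top without removing
        return self._items[-1]

    def is_empty(self):
        return not self._items

s = Stack()
s.push("a")
s.push("b")
print(s.pop())                       # b: last in, first out
```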

Understand Boolean algebra:

  • Define problems using Boolean logic
  • Manipulate Boolean expressions, including the use of Karnaugh maps to simplify Boolean expressions
  • Use the following rules to derive or simplify statements in Boolean algebra: De Morgan’s Laws, distribution, association, commutation, double negation (a short check of De Morgan’s Laws follows this list)
  • Using logic gate diagrams and truth tables
  • The logic associated with D-type flip-flops, half and full adders
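As a quick sanity check of the Boolean algebra rules above, here is a minimal Python sketch that verifies one of De Morgan's Laws across every row of its truth table:

```python
from itertools import product

# Brute-force check of NOT(A AND B) = (NOT A) OR (NOT B) by evaluating
# both sides for every combination of inputs.
for a, b in product([False, True], repeat=2):
    lhs = not (a and b)
    rhs = (not a) or (not b)
    print(a, b, lhs, rhs)
    assert lhs == rhs                # the two expressions always agree
```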

Rationale

Computers use electricity to represent data, the current having a high and a low state. Once digitised, this becomes the fundamental unit of computing: on and off, True and False, 1 and 0. This unit examines the binary system of 1s and 0s, which combine to represent not only positive integers, but anything we experience on our computers. This includes images and video, where millions of bytes are grouped together to form true-colour images, quickly enough to create games. It also includes sound, where analogue audio is sampled so frequently that it can be replayed in almost true fidelity.

Much of the focus of this unit is mathematical, where we learn about how computers store negative and decimal point values using binary, and how these values are then computed in mathematical operations and logical evaluations.

The unit also considers how items of data are combined in data structures. The choice of data structure (static, dynamic, one-dimensional, two-dimensional, stack, queue, binary tree, graph, hash table, and so on) is a critical one for a real-world computer scientist. This is because the decision has considerable implications for the run-time and relative efficiency of a program. This unit features opportunities to program using a range of data structures, so that the student becomes familiar with each structure, how data is stored within it, and how its operations are used to manipulate the data it stores.

unit overview - spring term unit 1.5

Legal, moral, cultural and ethical issues

Skills

Be able to investigate and discuss concerns regarding the development, use and impact of computing technology such as:

  • Legal
  • Moral
  • Cultural
  • Ethical

Be able to apply balanced arguments to discuss contemporary issues arising from using technology.

Be able to write balanced essays, which consider how digital technology has been harnessed and how it has affected individuals and society.

Be able to consider any emergent technological issue in terms of its impacts.

Be able to describe the purpose of each piece of Computing legislation.

Knowledge

Understand Computing related legislation:

  • Data Protection Act 1998
  • Computer Misuse Act 1990
  • Copyright Designs and Patents Act 1988
  • Regulation of Investigatory Powers Act 2000

Understand moral and ethical issues – the individual moral, social, ethical and cultural opportunities and risks of digital technology:

  • Computers in the workforce
  • Automated decision making
  • Artificial intelligence
  • Environmental effects
  • Censorship and the Internet
  • Monitoring behaviour
  • Analysing personal information
  • Piracy and offensive communications
  • Layout, colour paradigms and character sets

Rationale

The ubiquity of computers in modern society has led to a proliferation of computer-related crimes. The digitisation and networking of many aspects of our lives have had some undesirable consequences, both by design and by default. These include identity theft, harassment, bank fraud and cybercrime, all of which have seen new legislation attempting to protect the public from harm. These consequences also include societal fragmentation: those who have mastered new technologies are in a far stronger position than those who have not, while those without technology at all are unable even to participate in the digital society.

The computerisation of everything has created a large cluster of ethical issues. We use computers in the workplace, which at first augment work processes, until they become so powerful that they lead to human redundancies. We use computers for everything, until our landfills are full of obsolete models and our non-renewable energy is spent powering all this hardware. We can send video data instantly over our networks, but before long our lives are observed by an all-seeing eye that knows too much about us.

This unit is about understanding how we can harness computers and stay mindful of how they can harm us.

unit overview - spring term unit 2.3

Algorithms

Skills

Be able to analyse a given situation and design an algorithm for it

Be able to classify different algorithms (in terms of execution time and space) according to their suitability for a given task and data set

Be able to apply methods in order to determine the efficiency of different algorithms in terms of Big O notation (constant, linear, polynomial, exponential and logarithmic complexity)

Be able to describe the comparison points between the complexity of algorithms

Be able to demonstrate (in some cases write) the algorithms for the main data structures, (stacks, queues, trees, linked lists, depth-first (post-order) and breadth-first traversal of trees).

Be able to demonstrate (in some cases write) the standard algorithms (bubble sort, insertion sort, merge sort, quick sort, Dijkstra’s shortest path algorithm, A* algorithm, binary search and linear search)

Knowledge

Understand how to analyse and design algorithms for a given situation.

Understand the suitability of different algorithms for a given task and data set, in terms of execution time and space

Understand the measures and methods to determine the efficiency of different algorithms, Big O notation (constant, linear, polynomial, exponential and logarithmic complexity)

Understand the comparison points between the complexity of algorithms

Understand the algorithms for the main data structures, (stacks, queues, trees, linked lists, depth-first (post-order) and breadth-first traversal of trees).

Understand the standard algorithms (bubble sort, insertion sort, merge sort, quick sort, Dijkstra’s shortest path algorithm, A* algorithm, binary search and linear search)
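To illustrate two of the standard algorithms and the complexity comparison they invite, here is a minimal Python sketch of linear search, O(n), alongside binary search, O(log n); the data list is invented for illustration:

```python
# Linear search inspects items one by one; binary search repeatedly halves
# a sorted list. Both return the index of the target, or -1 if absent.
def linear_search(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):    # items must be sorted
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

data = [2, 5, 8, 12, 16, 23, 38]
assert linear_search(data, 23) == binary_search(data, 23) == 5
```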

Rationale

By this point, you will have written a wide range of computer programs for numerous purposes, and impressed yourself by mastering some complex problems. The step-by-step procedures at the heart of these programs are called algorithms, and in this unit you learn about some of the main algorithms in the field of computer science. In addition, the unit considers how any algorithm can be analysed in terms of its memory requirements and execution time. In real-world computer systems, this analysis is crucial, as it enables a computer scientist to see whether their algorithm is good enough, or whether another might do the job better.

To develop this understanding, we consider the stages of designing an algorithm: we must first define it, then design it, develop it, test it and evaluate it. This is, in fact, an algorithm for creating algorithms. These stages of development are worth mastering, as they make for a great Y13 project, but beyond that, they are a great foundation for considering how to make anything. As the saying goes, if something is worth making, it is worth making properly: software is no different, so let’s understand a process for making software that works.

unit overview - summer term unit 3.1

Programming project

Skills

Be able to undertake the following:

  • Break down a problem into smaller components
  • Justify a technological solution in terms of how it meets stakeholders’ requirements
  • Express a computing project in terms of requirements
  • Produce pseudocode for a decomposed problem which describes functions & data structures
  • Produce a test plan for the component parts of a solution
  • Program a large project in Python using OOP
  • Develop a large project component by component, applying an iterative development methodology
  • Test each component in isolation, as well as the solution as a whole
  • Describe how a complete project meets the requirements of its stakeholders

Knowledge

Understand how to analyse a problem in terms of identifying its scope, the requirements of its stakeholders, expressing its essential features and justifying an appropriate solution.

Understand how to design a solution in terms of expressing its decomposed components, and how each one will be coded in terms of subprograms & data structures, as well as how each component will be tested.

Understand how to develop a solution to a complex computing problem, using an iterative development methodology.

Understand how to test and maintain a final computing solution, both in its component parts and its whole.
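To show what testing a component in isolation can look like in Python, here is a minimal sketch using the standard library's unittest module; the score_word function is invented for illustration, standing in for one decomposed component of a project:

```python
import unittest

# The component under test: a deliberately simple, invented example.
def score_word(word):
    return len(word) * 2

class TestScoreWord(unittest.TestCase):
    def test_normal_data(self):
        self.assertEqual(score_word("cat"), 6)

    def test_boundary_data(self):       # boundary case: the empty string
        self.assertEqual(score_word(""), 0)

if __name__ == "__main__":
    unittest.main()
```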

Rationale

In this practical component, students analyse, design, develop, test, evaluate and document a program written in Python. Students will apply the principles of computational thinking to a practical coding problem – quite commonly to create an original computer game. Here, students should take an agile approach to the project's development.

The project assessment criteria are organised into specific categories, and the project is written up in a final report that documents the agile development process.

Historically, our students have been intensely proud of what they have produced in this project. It is here that the coding journey of the previous few years comes together, and produces a product greater than the sum of its parts. I look forward to working with you on yours.

knowledge organisers

A knowledge organiser is a document that lists the important facts that learners should know by the end of a unit of work. It is important that learners can recall these facts easily, so that when they are answering challenging questions in their assessments and in GCSE and A Level exams, they do not waste precious time remembering simple facts and can instead focus on making complex arguments and calculations.

We encourage all pupils to use them by doing the following:

  • Quiz themselves at home, using the read, write, cover, check method.
  • Practise spelling key vocabulary
  • Research further the people, events and processes most relevant to the unit.