Welcome to Everything Computer Science

Computer Science (CS)

Computer Science (CS) is the study of the principles and use of computers. Simply put, it is the science that deals with the theory and methods of processing information in digital computers, the design of computer hardware and software, and the applications of computers. A computer is simply a tool for a computer scientist, like a telescope for an astronomer.

CS Cheat Sheet CS Intro PDF

10 Reasons to Learn Computer Science

  1. Computing is part of everything we do!
  2. Expertise in computing enables you to solve complex, challenging problems.
  3. Computing enables you to make a positive difference in the world.
  4. Computing offers many types of lucrative careers.
  5. Computing jobs are here to stay, regardless of where you are located.
  6. Expertise in computing helps you even if your primary career choice is something else.
  7. Computing offers great opportunities for true creativity and innovativeness.
  8. Computing has space for both collaborative work and individual effort.
  9. Computing is an essential part of well-rounded academic preparation.
  10. Future opportunities in computing are without boundaries.

11 Computer Science Specializations

  1. Artificial Intelligence (AI)
  2. Computer Engineering
  3. Computer Security
  4. Computer Programming
  5. Robotics
  6. Neural Networks
  7. Software Engineering
  8. Computer Graphics
  9. Machine Learning
  10. Computer Networks
  11. Computational Biology

Computer Science Advice for Students from Professor Hamzeh Roumani (4 min.)

StudentawardsInc

In this video, Professor Hamzeh Roumani gives a brief summary of computer science and offers advice to students who are pursuing this major.



3 Great Insights of Computer Science:


  1. Gottfried Wilhelm Leibniz's, George Boole's, Alan Turing's, Claude Shannon's, and Samuel Morse's insight:
    All the information about any computable problem can be represented using only 0 and 1 (or any other bistable pair that can flip-flop between two easily distinguishable states, such as "on/off", "magnetized/de-magnetized", "high-voltage/low-voltage", etc.).

  2. Alan Turing's insight:
    There are only five actions that a computer has to perform in order to do "anything". Every algorithm can be expressed in a language consisting of just these five basic instructions:
    • move left one location;
    • move right one location;
    • read symbol at current location;
    • print 0 at current location;
    • print 1 at current location.

  3. Corrado Böhm and Giuseppe Jacopini's insight:
    Only three ways of combining these actions (into more complex ones) are needed for a computer to do "anything":
    • sequence: first do this, then do that;
    • selection: IF such-and-such is the case, THEN do this, ELSE do that;
    • repetition: WHILE such-and-such is the case DO this.
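
The sketch below is one way to see all three insights at once: the tape holds nothing but 0s and 1s (insight 1), the only primitive actions are Turing's five (insight 2), and they are combined solely by sequence, selection, and repetition (insight 3). The tape contents and the "flip every bit" task are illustrative choices, not part of the original insights.

    # Tape of 0s and 1s (insight 1); five primitive actions (insight 2);
    # combined only by sequence, selection, and repetition (insight 3).
    tape = [1, 0, 1, 1, 0]        # a small, finite tape for illustration
    head = 0                      # current location

    def read():                   # read symbol at current location
        return tape[head]

    def write(symbol):            # print 0 or 1 at current location
        tape[head] = symbol

    def move_right():             # move right one location
        global head
        head += 1

    while head < len(tape):       # repetition: WHILE not past the end DO ...
        if read() == 0:           # selection: IF the symbol is 0 ...
            write(1)              #   THEN print 1
        else:
            write(0)              #   ELSE print 0
        move_right()              # sequence: then move on to the next cell

    print(tape)                   # [0, 1, 0, 0, 1]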


Computer Science Unplugged - The Show (1 Hour)

csunplugged.org

This video shows an entertaining way to introduce Computer Science to school students.

Download Video Download PDF


Best Programming Languages to Learn

The following languages provide a reasonable mixture of paradigms and practical applications:

  • JavaScript
    • JavaScript is a good representative of the semantic model popular in dynamic, higher-order languages such as Python, Ruby and Perl. As the native language of the web, its pragmatic advantages are unique.
  • Java
    • The primary advantage of Java application development is that it is free and its syntax bears resemblance to various C-based programming languages, making it easier for developers to understand and implement.
  • C
    • If you want to be able to do more than write a simple web app, C is a great language. If you want to write a great, fast game, C is again a great choice. You can write an entire OS in C.
  • C++
    • You can tap directly into the Windows API and work magic with it. You can program with objects and classes or abandon them altogether and make a C-style structured program.
  • PHP
    • PHP is a lot easier to get started with than you might think. By learning just a few simple functions, you are able to do a lot of things with your website. And once you know the basics, there is a wealth of scripts available on the internet that you only need to tweak a little to fit your needs.
  • Haskell
    • Haskell is the crown jewel of the Hindley-Milner family of languages. Fully exploiting laziness, Haskell comes closest to programming in pure mathematics of any major programming language.
  • Assembly
    • Learning compilers is the best way to learn assembly, since it gives the computer scientist an intuitive sense of how high-level code will be transformed.
  • Python
    • It’s an easy language that just about anyone can master in a short period of time. If you’re impatient and want to make quick scripts that deliver results (and you don’t mind very rudimentary debugging), then you should definitely explore it!
  • SQL
    • SQL is everywhere, and I'm not saying that because I want you to use it. It's just a fact. I bet you have some in your pocket right now: all Android phones and iPhones have easy access to a SQL database called SQLite, and many applications on your phone use it directly. It runs banks, hospitals, universities, governments, and businesses small and large; just about every computer, and every person on the planet, eventually touches something running SQL. SQL is an incredibly successful and solid technology.
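
To make the SQL point concrete, here is a minimal, hedged sketch using Python's built-in sqlite3 module, which talks to the same SQLite engine mentioned above; the "books" table and its rows are invented purely for the example.

    import sqlite3

    conn = sqlite3.connect(":memory:")       # a throwaway in-memory SQLite database
    conn.execute("CREATE TABLE books (title TEXT, year INTEGER)")
    conn.executemany("INSERT INTO books VALUES (?, ?)",
                     [("SICP", 1985), ("The C Programming Language", 1978)])

    for title, year in conn.execute("SELECT title, year FROM books ORDER BY year"):
        print(title, year)

    conn.close()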

Scratch Game

Wizarding World of Harry Potter Game

randerson112358 scratch project

Here I created a simple game called Wizarding World of Harry Potter using the MIT programming language "Scratch". This game is one of many things you can create with minimal knowledge of Computer Science and programming. Hopefully I will finish it soon!

Download Game

What every computer science major should know!

  • Portfolio versus Resume
    • A resume says nothing of a programmer's ability. Every computer science major should build a portfolio. A portfolio could be as simple as a personal blog, with a post for each project or accomplishment. A better portfolio would include per-project pages, and publicly browsable code (hosted perhaps on GitHub or Google Code). Contributions to open source should be linked and documented. A code portfolio allows employers to directly judge ability. GPAs and resumes do not.

  • Technical Communication
    • Lone wolves in computer science are an endangered species. Computer students must practice persuasively and clearly communicating their ideas to non-programmers. In smaller companies, whether or not a programmer can communicate his/her ideas to management may make the difference between the company's success and failure. Unfortunately, this is not something fixed with the addition of a single class (although a solid course in technical communication doesn't hurt). More classes need to provide students the opportunity to present their work and defend their ideas with oral presentations.

  • An Engineering core
    • Computer science is not quite engineering. But it's close enough. Computer scientists will find themselves working with engineers, designers, managers, etc. They all need to speak the same language.

  • The Unix philosophy
    • Computer students should be comfortable with and practiced in the Unix philosophy of computing. The Unix philosophy (as opposed to Unix itself) is one that emphasizes linguistic abstraction and composition in order to effect computation. In practice, this means becoming comfortable with the notion of command-line computing, text-file configuration and IDE-less software development.
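
    In that spirit, here is a minimal sketch of a composable command-line filter: it reads lines on stdin, keeps those containing a word given as an argument, and writes them to stdout, so it can sit in a pipeline such as cat notes.txt | python keep.py TODO (the script name keep.py is just an illustrative choice).

      import sys

      # Keep only the lines of stdin that contain the word given on the command line.
      needle = sys.argv[1] if len(sys.argv) > 1 else ""
      for line in sys.stdin:
          if needle in line:
              sys.stdout.write(line)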

  • Systems Administration
    • Some computer scientists/engineers sneer at systems administration as an "IT" task (yes, people, there is a huge difference). The thinking is that a computer student can teach himself/herself how to do anything a technician can do. This is true. (In theory.) Yet this attitude is misguided: computer scientists and engineers must be able to competently and securely administer their own systems and networks. Many tasks in software development are most efficiently executed without passing through a systems administrator.

  • Programming languages
    • Programming languages rise and fall with the solar cycle. A programmer's career should not. While it is important to teach languages relevant to employers, it is equally important that students learn how to teach themselves new languages. The best way to learn how to learn programming languages is to learn multiple programming languages and programming paradigms. The difficulty of learning the nth language is half the difficulty of the (n-1)th. Yet, to truly understand programming languages, one must implement one. Ideally, every computer science major would take a compilers class. At a minimum, every computer science major should implement an interpreter.
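
    As one hedged illustration of the interpreter suggestion, here is a deliberately tiny evaluator; the mini-language (arithmetic expressions written as nested Python tuples) is invented purely for the example.

      import operator

      # Expressions are nested tuples such as ("+", 1, ("*", 2, 3)).
      OPS = {"+": operator.add, "-": operator.sub,
             "*": operator.mul, "/": operator.truediv}

      def evaluate(expr):
          if isinstance(expr, (int, float)):       # a literal evaluates to itself
              return expr
          op, left, right = expr                   # otherwise: (operator, lhs, rhs)
          return OPS[op](evaluate(left), evaluate(right))

      print(evaluate(("+", 1, ("*", 2, 3))))       # 7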

  • Discrete mathematics
    • Students must have a solid grasp of formal logic and of proof. Proof by algebraic manipulation and by natural deduction engages the reasoning common to routine programming tasks. Proof by induction engages the reasoning used in the construction of recursive functions. Students must be fluent in formal mathematical notation, and in reasoning rigorously about the basic discrete structures: sets, tuples, sequences, functions and power sets.
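
    A small sketch of the induction/recursion connection: the base case and inductive step of a proof are mirrored by the two branches of this recursive function (the function itself is just an illustrative example).

      def total(nums):
          # Base case: the claim "total(nums) sums nums" holds for the empty list.
          if not nums:
              return 0
          # Inductive step: if the claim holds for the tail, it holds for the whole list.
          return nums[0] + total(nums[1:])

      print(total([3, 1, 4, 1, 5]))   # 14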

  • Data structures and algorithms
    • Students should certainly see the common (or rare yet unreasonably effective) data structures and algorithms. But, more important than knowing a specific algorithm or data structure (which is usually easy enough to look up), students must understand how to design algorithms (e.g., greedy, dynamic strategies) and how to span the gap between an algorithm in the ideal and the nitty-gritty of its implementation.
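
    As a hedged illustration of the design-strategy point, here is a dynamic-programming sketch; the coin-change problem and the coin denominations are arbitrary choices for the example.

      def min_coins(coins, amount):
          # best[a] = fewest coins needed to make amount a, built bottom-up.
          INF = float("inf")
          best = [0] + [INF] * amount
          for a in range(1, amount + 1):
              for c in coins:
                  if c <= a and best[a - c] + 1 < best[a]:
                      best[a] = best[a - c] + 1
          return best[amount] if best[amount] != INF else None

      print(min_coins([1, 5, 10, 25], 63))   # 6  (25 + 25 + 10 + 1 + 1 + 1)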

  • Theory
    • A grasp of theory is a prerequisite to research in graduate school. Theory is invaluable when it provides hard boundaries on a problem (or when it provides a means of circumventing what initially appear to be hard boundaries). Computational complexity can legitimately claim to be one of the few truly predictive theories in all of computer "science." A computer student must know where the boundaries of tractability and computability lie. To ignore these limits invites frustration in the best case, and failure in the worst.

  • Architecture
    • There is no substitute for a solid understanding of computer architecture. Everyone should understand a computer from the transistors up. The understanding of architecture should encompass the standard levels of abstraction: transistors, gates, adders, muxes, flip flops, ALUs, control units, caches and RAM. An understanding of the GPU model of high-performance computing will be important for the foreseeable future.

  • Operating systems
    • Any sufficiently large program eventually becomes an operating system. As such, a person should be aware of how kernels handle system calls, paging, scheduling, context-switching, filesystems and internal resource management. A good understanding of operating systems is secondary only to an understanding of compilers and architecture for achieving performance. Understanding operating systems (which I would interpret liberally to include runtime systems) becomes especially important when programming an embedded system without one.

  • Networking
    • Given the ubiquity of networks, a person should have a firm understanding of the network stack and routing protocols within a network. The mechanics of building an efficient, reliable transmission protocol (like TCP) on top of an unreliable transmission protocol (like IP) should not be magic to a computer scientist. It should be core knowledge. People must understand the trade-offs involved in protocol design--for example, when to choose TCP and when to choose UDP. (Programmers also need to understand the larger social implications of congestion should they use UDP at large scale.)
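
    A minimal sketch of the UDP side of that trade-off, using only the standard socket module; the loopback address and the tiny payload are arbitrary choices, and nothing below the application guarantees delivery or ordering of the datagram.

      import socket

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)   # UDP, not TCP
      sock.bind(("127.0.0.1", 0))                # let the OS pick a free port
      addr = sock.getsockname()

      sock.sendto(b"hello", addr)                # fire-and-forget datagram to ourselves
      data, sender = sock.recvfrom(1024)         # on loopback this will arrive
      print(data, sender)
      sock.close()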

  • Security
    • The sad truth of security is that the majority of security vulnerabilities come from sloppy programming. The sadder truth is that many schools do a poor job of training programmers to secure their code. Developers must be aware of the means by which a program can be compromised. They need to develop a sense of defensive programming--a mind for thinking about how their own code might be attacked. Security is the kind of training that is best distributed throughout the entire curriculum: each discipline should warn students of its native vulnerabilities.
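
    One concrete defensive habit, sketched with Python's built-in sqlite3 module (the users table and the hostile input are invented for the example): never splice untrusted input into a query string; a parameterized query keeps data and code separate.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice')")

      user_input = "alice' OR '1'='1"    # hostile input a sloppy program would trust

      # Vulnerable: string formatting would let the input rewrite the query.
      # rows = conn.execute("SELECT * FROM users WHERE name = '%s'" % user_input)

      # Defensive: a parameterized query treats the input purely as data.
      rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,))
      print(rows.fetchall())             # []  (no match, and no injection)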

  • Cryptography
    • Cryptography is what makes much of our digital lives possible. Modern students should understand and be able to implement the core cryptographic concepts, as well as the common pitfalls in doing so.
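
    One hedged example of a core concept and its classic pitfall, using only the standard library (the password, salt size, and iteration count are illustrative, not a recommendation): a bare, fast hash is not a password store; a salted, deliberately slow key-derivation function is the safer pattern.

      import hashlib, hmac, os

      password = b"correct horse battery staple"
      salt = os.urandom(16)                          # random per-user salt

      # Pitfall: a bare, fast hash invites brute force and rainbow tables.
      weak = hashlib.sha256(password).hexdigest()

      # Better: a salted, slow key-derivation function.
      key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

      # Verify later by re-deriving and comparing in constant time.
      again = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)
      print(hmac.compare_digest(key, again))         # True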

  • Software testing
    • Software testing must be distributed throughout the entire curriculum. A course on software engineering can cover the basic styles of testing, but there's no substitute for practicing the art. Students should be graded on the test cases they turn in. I run the test cases turned in by each student against all other students' code. Students don't seem to care much about developing defensive test cases, but they unleash hell when it comes to sandbagging their classmates.
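
    A small sketch of what "turning in test cases" can look like, using only the standard unittest module; the clamp function under test is invented for the example.

      import unittest

      def clamp(x, low, high):
          """Constrain x to the closed interval [low, high]."""
          return max(low, min(x, high))

      class ClampTests(unittest.TestCase):
          def test_inside_range(self):
              self.assertEqual(clamp(5, 0, 10), 5)

          def test_edges_and_outside(self):          # the defensive cases
              self.assertEqual(clamp(0, 0, 10), 0)
              self.assertEqual(clamp(-3, 0, 10), 0)
              self.assertEqual(clamp(99, 0, 10), 10)

      if __name__ == "__main__":
          unittest.main()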

  • User experience design
    • Programmers too often write software for other programmers, or worse, for themselves. User interface design (or more broadly, user experience design) might be the most underappreciated aspect of computer science. There's a misconception, even among professors, that user experience is a "soft" skill that can't be taught. In reality, modern user experience design is anchored in empirically-wrought principles from human factors engineering and industrial design. If nothing else, engineers should know that interfaces need to make the ease of executing any task proportional to the frequency of the task multiplied by its importance. As a practicality, every programmer should be comfortable with designing usable web interfaces in HTML, CSS and JavaScript.

  • Visualization
    • Good visualization is about rendering data in such a fashion that humans perceive it as information. This is not an easy thing to do. The modern world is a sea of data, and exploiting the local maxima of human perception is key to making sense of it.

  • Parallelism
    • Parallelism is back, and uglier than ever. The unfortunate truth is that harnessing parallelism requires deep knowledge of architecture: multicore, caches, buses, GPUs, etc. And, practice. Lots of practice.
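
    A minimal sketch of putting several cores to work from Python; the prime-counting workload is an arbitrary CPU-bound stand-in, and the sketch says nothing about caches or buses, but it is a place to start practicing.

      from concurrent.futures import ProcessPoolExecutor

      def count_primes(limit):
          # Arbitrary CPU-bound work: count primes below limit (naive on purpose).
          return sum(n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
                     for n in range(2, limit))

      if __name__ == "__main__":                     # required for process pools
          with ProcessPoolExecutor() as pool:        # one worker per core by default
              results = list(pool.map(count_primes, [10_000, 20_000, 30_000, 40_000]))
          print(results)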

  • Software engineering
    • The principles in software engineering change about as fast as the programming languages do. A good, hands-on course in the practice of team software construction provides a working knowledge of the pitfalls inherent in the endeavor. It's been recommended by several readers that students break up into teams of three, with the role of leader rotating through three different projects. Learning how to attack and maneuver through a large existing codebase is a skill most programmers will have to master, and it's one best learned in school instead of on the job.

  • Formal methods
    • As the demands on secure, reliable software increase, formal methods may one day end up as the only means for delivering it. At present, formal modeling and verification of software remains challenging, but progress in the field is steady: it gets easier every year. There may even come a day within the lifetime of today's computer science majors where formal software construction is an expected skill. Every coder should be at least moderately comfortable using one theorem prover. (I don't think it matters which one.) Learning to use a theorem prover immediately impacts coding style. For example, one feels instinctively allergic to writing a match or switch statement that doesn't cover all possibilities. And, when writing recursive functions, one feels compelled to show that they actually terminate.

  • Graphics and simulation
    • There is no discipline more dominated by "clever" than graphics. The field is driven toward, even defined by, the "good enough." As such, there is no better way to teach clever programming or a solid appreciation of optimizing effort than graphics and simulation. Over half of the coding hacks I've learned came from my study of graphics.

  • Robotics
    • Robotics may be one of the most engaging ways to teach introductory programming. Moreover, as the cost of robotics continues to fall, thresholds are being passed which will enable a personal robotics revolution. For those that can program, unimaginable degrees of personal physical automation are on the horizon.

  • Artificial intelligence
    • If for no other reason than its outsized impact on the early history of computing, students should study artificial intelligence. While the original dream of intelligent machines seems far off, artificial intelligence spurred a number of practical fields, such as machine learning, data mining and natural language processing.

  • Machine learning
    • Aside from its outstanding technical merits, the sheer number of job openings for "relevance engineer" indicates that every student should grasp the fundamentals of machine learning. Machine learning doubly emphasizes the need for an understanding of probability and statistics.

  • Databases
    • Databases are too common and too useful to ignore. It's useful to understand the fundamental data structures and algorithms that power a database engine, since programmers often enough reimplement a database system within a larger software system. Relational algebra and relational calculus stand out as exceptional success stories in sub-Turing models of computation. Unlike UML modeling, ER modeling seems to be a reasonable mechanism for visually encoding the design of and constraints upon a software artifact.
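
    As a hedged sketch of the relational-algebra point, selection and projection can be written as two small, composable functions over a toy relation (a list of rows represented as dictionaries, with invented data).

      # Toy relation: a list of rows, each row a dict.
      students = [
          {"name": "Ada",   "major": "CS",   "year": 2},
          {"name": "Alan",  "major": "Math", "year": 4},
          {"name": "Grace", "major": "CS",   "year": 3},
      ]

      def select(relation, predicate):               # relational algebra's sigma
          return [row for row in relation if predicate(row)]

      def project(relation, columns):                # relational algebra's pi
          return [{c: row[c] for c in columns} for row in relation]

      cs_names = project(select(students, lambda r: r["major"] == "CS"), ["name"])
      print(cs_names)        # [{'name': 'Ada'}, {'name': 'Grace'}]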


The wonderful and terrifying implications of computers that learn

TEDx Talks

This talk was given at a local TEDx event, produced independently of the TED Conferences, on the extraordinary, wonderful, and terrifying implications of computers that can learn.

Read Machine Learning



What is Code?

What is code? (Invisible Code)

Bloomberg Business

June 10 -- We know that when we enter code into a computer we get software. And we know that software is part of the fabric of our lives - it switches channels on our cable boxes or spits out money from an ATM. But how does all that actually happen? Bloomberg Businessweek author Paul Ford explains. (Animation by Bran Dougherty Johnson.)

Read Article