Computer science is a hard discipline to learn. But if you are motivated and devote sufficient time to studying it, learning computer science is entirely possible. It seems hard at first because learning to program is challenging. ... However, most people learn skills step by step over time.
Computer science is the study of computers and computing, including their theoretical and algorithmic foundations, hardware and software, and their uses for processing information. The discipline of computer science includes the study of algorithms and data structures, computer and network design, modeling data and information processes, and artificial intelligence. Computer science draws some of its foundations from mathematics and engineering and therefore incorporates techniques from areas such as queueing theory, probability and statistics, and electronic circuit design. Computer science also makes heavy use of hypothesis testing and experimentation during the conceptualization, design, measurement, and refinement of new algorithms, information structures, and computer architectures.
Computer science is considered part of a family of five separate yet interrelated disciplines: computer engineering, computer science, information systems, information technology, and software engineering. This family has come to be known collectively as the discipline of computing. These five disciplines are interrelated in the sense that computing is their object of study, but they are separate in that each has its own research perspective and curricular focus. (Since 1991 the Association for Computing Machinery [ACM], the IEEE Computer Society [IEEE-CS], and the Association for Information Systems [AIS] have collaborated to develop and update the taxonomy of these five interrelated disciplines and the guidelines that educational institutions worldwide use for their undergraduate, graduate, and research programs.)
The major subfields of computer science include the traditional study of computer architecture, programming languages, and software development. However, they also include computational science (the use of algorithmic techniques for modeling scientific data), graphics and visualization, human-computer interaction, databases and information systems, networks, and the social and professional issues that are unique to the practice of computer science. As may be evident, some of these subfields overlap in their activities with other modern fields, such as bioinformatics and computational chemistry. These overlaps are the consequence of a tendency among computer scientists to recognize and act upon their field’s many interdisciplinary connections.
- Computer science is the study of computers and computing concepts. It includes both hardware and software, as well as networking and the Internet.
- Computer science is the study of computers and computational systems. Unlike electrical and computer engineers, computer scientists deal mostly with software and software systems; this includes their theory, design, development, and application.
- Principal areas of study within computer science include artificial intelligence, computer systems and networks, security, database systems, human-computer interaction, vision and graphics, numerical analysis, programming languages, software engineering, bioinformatics, and the theory of computing.
- Although knowing how to program is essential to the study of computer science, it is only one element of the field. Computer scientists design and analyze algorithms to solve problems and study the performance of computer hardware and software. The problems ...
Computer science emerged as an independent discipline in the early 1960s, although the electronic digital computer that is the object of its study was invented some two decades earlier. The roots of computer science lie primarily in the related fields of mathematics, electrical engineering, physics, and management information systems. In the past sixty years or so, computers have migrated from room-size mega boxes to desktops to laptops to our pockets. The real history of machine-assisted human computation (“computer” originally referred to the person, not the machine) goes back even further. The modern computing-machine era, however, began with Alan Turing's conception of the Turing machine and with three Bell Labs scientists' invention of the transistor, which made modern-style computing possible and earned them the 1956 Nobel Prize in Physics.
- Consider how you use a computer in a typical day. For example, you start working on a report, and once you have completed a paragraph, you perform a spell check. You open up a spreadsheet application to do some financial projections to see if you can afford a new car loan. You use a web browser to search online for the kind of car you want to buy.
- You may not think about this very consciously, but all of these operations performed by your computer rely on algorithms. An algorithm is a well-defined procedure that allows a computer to solve a problem; another way to describe it is as a sequence of unambiguous instructions. 'Unambiguous' means there is no room for subjective interpretation: every time you ask your computer to carry out the same algorithm, it will do so in exactly the same manner, with exactly the same result.
- Consider the earlier examples again. Spell checking uses algorithms. Financial calc...
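To make the idea concrete, here is a minimal sketch of spell checking as an algorithm, written in Python. The tiny word list and the function name are hypothetical, chosen only for illustration; a real spell checker uses a full dictionary and smarter matching, but the key property is the same: the same input always produces the same output.

```python
# A hypothetical, minimal dictionary for illustration only.
DICTIONARY = {"the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"}

def misspelled_words(text):
    """Return the words in `text` that are not in DICTIONARY.

    Each step is unambiguous: lowercase the text, split it into
    words, and keep every word absent from the dictionary.
    """
    words = text.lower().split()
    return [w for w in words if w not in DICTIONARY]

print(misspelled_words("the quikc brown fox"))  # → ['quikc']
```

Because every instruction is fully specified, running this procedure twice on the same sentence can never flag different words, which is exactly what "no room for subjective interpretation" means.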