Sunday, November 24, 2013

Artificial Intelligence and the Modern World

When the term "Artificial Intelligence" is brought up, the most common instant association is hyper-intelligent robots that have become self-aware and sentient and, with their newly gained consciousness, desire to wipe out humanity and destroy their creators.

That is the Hollywood version of artificial intelligence that has become widespread (although it is quite a fun one). The simplest way to describe an artificial intelligence is as a set of commands that determine the behavior of some kind of mechanism, most commonly a computer.

The concept of artificial intelligence was aided by the rapid development of computer science over the last century.

A set of commands that execute to perform functions is the base idea of an artificial intelligence. Taken to a much more complex level, these commands can have so many functions and applications that they can perform highly complex tasks. One such example is the Mars rover Curiosity, launched in 2011. A probe to Mars has to be equipped with scientific functions that can collect and test material on the planet and relay that information back to us on Earth.

In future applications, artificial intelligences are sure to become more widespread. In fact, they are widespread even now, but we often don't notice. Our calculators are extremely primitive examples of machines that can compute mathematical formulas. A more modern example is the artificial intelligence of an enemy or non-player character in a video game. These models contain subroutines that are carried out when certain conditions are met, such as attack, talk, sit, or idle, and the list is immense.
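A toy sketch of that video-game idea: a non-player character whose subroutines (attack, talk, idle) are chosen when certain conditions are met. The names and thresholds here are invented for the illustration.

```python
# A non-player character picks its subroutine from the current
# circumstances -- the "set of determining commands" described above.
# The distance threshold of 5 is an arbitrary example value.

def npc_behavior(player_distance, player_is_hostile):
    """Return which subroutine the NPC should carry out."""
    if player_distance < 5 and player_is_hostile:
        return "attack"
    if player_distance < 5:
        return "talk"
    return "idle"

print(npc_behavior(3, True))    # attack
print(npc_behavior(3, False))   # talk
print(npc_behavior(20, False))  # idle
```

Real game AI stacks many more rules and states on top of this, but the core pattern is the same: conditions in, subroutine out.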

The pop-culture notion of robots taking over the world is fiction. Matching the full power of the human brain would take an enormous share of all the computing power in existence today.

http://aitopics.org/misc/brief-history

Friday, November 15, 2013

Computer Science: Post 1900s



Once the developmental basis had reached a strong foundational level, for example the development of Boolean and binary mathematics and mechanical devices that could perform some computations, the hardware and logic could truly be focused on and developed into computers and the logical languages they run on.

During the 1940s, at the time of World War II, the need for an electronic computer was immensely spurred by complex calculations such as ballistic trajectory tables. Militaries also encrypted their transmissions, and they wanted machines both to protect their own communications and to crack other militaries' codes.

In 1951 the first compiler, a program that translates written code into a form the machine can execute, was created by Grace Murray Hopper. The compiler is an integral part of writing and running code. Around the same time, Alan Turing proposed the Turing Test, one of the first serious attempts to define what it would mean for a machine to be intelligent.
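To make the compiler idea concrete, here is a toy sketch: it translates a tiny arithmetic "source language" into instructions for a simple stack machine, then runs them. Everything here (the token names, the instruction set) is invented for the example; real compilers are vastly more sophisticated.

```python
# A toy compiler: translate postfix arithmetic like "3 4 +" into
# low-level instructions, then execute them on a tiny stack machine.

def compile_expr(source):
    """Translate source text into a list of machine instructions."""
    instructions = []
    for tok in source.split():
        if tok.isdigit():
            instructions.append(("PUSH", int(tok)))
        elif tok == "+":
            instructions.append(("ADD", None))
        elif tok == "*":
            instructions.append(("MUL", None))
        else:
            raise ValueError(f"unknown token: {tok}")
    return instructions

def run(instructions):
    """Execute compiled instructions on a stack machine."""
    stack = []
    for op, arg in instructions:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

program = compile_expr("3 4 + 2 *")   # (3 + 4) * 2
print(run(program))                   # 14
```

The key point is the separation of steps: one program translates the code, and another (the machine) executes the result.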

In 1960, the term "Computer Science" was coined by George Forsythe, a numerical analyst. Since then, computers and computer science have continued to grow in leaps and bounds, year after year, like clockwork.


Alan Turing

https://cs.uwaterloo.ca/~shallit/Courses/134/history.html

Computer Science: Pre 1900s



The need to perform mathematical computations has existed far back in the past: the great pantheons, the Roman Colosseum, and the Great Pyramids of Giza would all have been impossible to build without them. As time progressed, more and more devices and methods with specific mathematical functions came into existence, such as Napier's rods, which simplify multiplication; a mechanical adding machine by Blaise Pascal; and a loom that could weave intricate patterns, by Joseph-Marie Jacquard.

Eventually, over the following centuries, mathematics developed to a level unprecedented in history. This rise of mathematics and computation provided the basis for a field that would deal with these things, that is, computer science. It spawned many of the building blocks used in computer science today, such as Boolean logic by George Boole, binary arithmetic by Gottfried Wilhelm Leibniz, and one of the first attempts at mechanical computation by Charles Babbage.
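Two of those building blocks are easy to show directly. Boolean logic works with just two values, true and false; binary arithmetic writes every number with just two digits, 0 and 1. Modern computers rest on both, as this small illustration shows.

```python
# Boolean logic (George Boole): the truth table for AND, OR, and NOT.
for a in (False, True):
    for b in (False, True):
        print(a, b, "AND:", a and b, "OR:", a or b, "NOT a:", not a)

# Binary arithmetic (Leibniz): the number 13 written in base 2.
print(bin(13))         # '0b1101' -> 8 + 4 + 0 + 1
print(int("1101", 2))  # and back again: 13
```

Every circuit in a modern computer is, at bottom, a physical embodiment of these two ideas.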

Needless to say, these were some of the foundational blocks of mathematics and computation that spawned the logic and methodology that computer science is founded and centered upon.



Sunday, November 10, 2013

P2P: Advantages and Disadvantages

What exactly is P2P? P2P is an acronym for peer-to-peer and refers to a computer networking system in which multiple computer users share data with one another simultaneously in order to speed up the transfer process. We can see an everyday example of such networks in our homes, with all of a family's computers linked to the home network.

Unlike the client-server networking model, a P2P network does not rely on a central server; downloading a file from a single server is greatly slower than downloading it from multiple sources at once.
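The speed-up is easy to sketch with made-up numbers: split a file into chunks and let each peer send a different chunk in parallel, instead of one server sending everything in sequence. The chunk counts and timings below are invented purely for illustration.

```python
# Why P2P can be faster: a file split into chunks can be fetched from
# several peers at once instead of one server, one chunk at a time.

FILE_CHUNKS = 8     # the file is split into 8 chunks
CHUNK_TIME = 1.0    # seconds for any one machine to send one chunk

def server_download(chunks):
    """One server sends every chunk, one after another."""
    return chunks * CHUNK_TIME

def p2p_download(chunks, peers):
    """Peers send different chunks in parallel; total time is set by
    the busiest peer (chunks split as evenly as possible)."""
    per_peer = -(-chunks // peers)   # ceiling division
    return per_peer * CHUNK_TIME

print(server_download(FILE_CHUNKS))   # 8.0 seconds from one server
print(p2p_download(FILE_CHUNKS, 4))   # 2.0 seconds from four peers
```

Real P2P protocols also have to track which peer holds which chunk and verify each piece, but the parallelism above is where the speed comes from.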

Most P2P activity occurs on the internet, where the file-sharing community can do much more than a home network allows. However, the P2P activity that occurs there is often classified as illegal because users are not restricted to sharing only certain files.

P2P sharing also has one big flaw: malicious people may share files that contain hidden viruses or trojans. Security is the one advantage client-server sharing has over P2P sharing.

However, there are some ways to protect yourself if you much prefer P2P: read the comments and stay away from files that have zero sharers and zero downloaders.




http://compnetworking.about.com/od/p2ppeertopeer/a/p2pintroduction.htm

Sunday, November 3, 2013

Data Structures

What exactly are data structures? Outside of computer science, engineering, and some fields of biology, a solid knowledge of data structures is rare. That's why it can be a great benefit to have a broad understanding of the concept, what it entails, and why it is relevant today.

Simply put, data structures are systems or methods of organizing data in a computer in such a way that the data is easy to access and efficient to store.

For large companies that deal with extremely large amounts of sensitive and private data, the way data is stored becomes a highly valued area of development and interest. For financial institutions, private customer data like account information, credit card numbers, and records must be managed efficiently and safely.

Data structures are based on a very basic principle: data, in forms such as integers or strings (that is, text), is stored at specific addresses in the computer's memory (RAM). That data is then manipulated by a set of written procedures in order for the user to achieve their desired objective.
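One classic example makes this concrete: the stack. It pairs stored data with a small set of procedures (push, pop, peek) that are the only way the data is touched, which is exactly the pattern described above. This is a minimal sketch, not a production implementation.

```python
# A stack: stored data plus the written procedures that manipulate it.
# Items come back in last-in, first-out order.

class Stack:
    def __init__(self):
        self._items = []          # the stored data

    def push(self, item):
        """Add an item on top of the stack."""
        self._items.append(item)

    def pop(self):
        """Remove and return the most recently added item."""
        return self._items.pop()

    def peek(self):
        """Look at the top item without removing it."""
        return self._items[-1]

s = Stack()
s.push(1)
s.push("two")     # integers and strings alike, as described above
print(s.peek())   # two
print(s.pop())    # two
print(s.pop())    # 1
```

Lists, queues, trees, and hash tables follow the same recipe: a storage layout plus the procedures that make access efficient for a particular job.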