What is scientific computing and what sets it apart from normal computing?
Scientific computing is the detailed modeling of real-world systems, such as business models or trends in data about societal interests, using specialized computational methods. A common place to see scientific computation is in businesses that need to track their finances and profit margins and predict future trends that will affect them.
Scientific computation provides the mathematical and informational basis of numerical simulation. It reconstructs past events or predicts future trends and processes, and it is used heavily in business, the sciences, and engineering. It is often called the third way of obtaining knowledge, alongside theory and experiment.
In scientific computing, the objectives depend on the task. It can be used to reconstruct and understand events like natural disasters, to optimize systems like the processing power and efficiency of a computer, or to predict scenarios such as the weather or the behavior of new materials.
Needless to say, scientific computing is an invaluable tool in a competitive economy and scientific community.
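To give a concrete, if toy, sense of what a numerical simulation looks like, here is a minimal sketch in Python; the growth rate and starting revenue are made-up values for illustration, not data from any real business.

# Minimal sketch of a numerical simulation: step a quantity forward
# in time, one small step at a time. The starting value and growth
# rate below are illustrative assumptions, not real data.

def simulate_growth(initial_value, growth_rate, years):
    """Project a simple growth model forward one year at a time."""
    values = [initial_value]
    for _ in range(years):
        # the next value depends only on the current one (a difference equation)
        values.append(values[-1] * (1 + growth_rate))
    return values

if __name__ == "__main__":
    revenue = simulate_growth(initial_value=100_000, growth_rate=0.05, years=10)
    for year, value in enumerate(revenue):
        print(f"year {year:2d}: {value:,.2f}")

Larger scientific codes apply the same basic idea, just with far more variables and far more sophisticated equations, to things like weather or materials models.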
http://www.brockport.edu/cps/whatis.html
http://www5.in.tum.de/lehre/vorlesungen/sci_comp/ws03/material/slides01.pdf
CS100W Technical Blog
This blog is based on the review and research of modern technical topics such as the relationship between business and the computer age, the internet, internet security, publicity through the internet, and social media and business.
Sunday, December 15, 2013
Sunday, December 8, 2013
Computer Graphics, From One Extreme End to Another
Computers are a completely new invention in relation to the history of humankind. Yet in the short span of under a century, computers have made an unimaginable contribution to our lives and become integrated into them in ways that were completely unforeseen.
One of the major ways we use computers is to play games on them, and because of this the evolution of computer graphics has accelerated dramatically: from 8-bit character models to textures (detail and resolution) so refined that they are now photo-realistic, making the virtual world you are playing in far more immersive and enjoyable.
A short history of the major developments in computer graphics starts in 1998, when 16-bit depth with color and textures was developed. In 1999, multi-texture 32-bit rendering emerged, followed by cube maps, texture compression, and anisotropic filtering. 2001 brought programmable vertex shaders, 3D textures, shadow maps, and multisampling. In 2002, early z-cull and dual-monitor capability were added; in 2003, fragment programs and color and depth compression; in 2004, flow control, floating-point textures, and vertex texture fetch; in 2005, transparency antialiasing; in 2006, unified shaders and geometry shaders; and in 2007, double precision.
That is to say, each of these features is incredibly complex, but in an extremely summarized form: developers continued to push the definition and detail of what they could incorporate into games, and improved the software and hardware that render these images quickly and efficiently.
A 16-bit image compared with current-generation photorealistic graphics
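As a rough illustration of what one of the items above, texture filtering, actually does, here is a minimal sketch of bilinear sampling in Python; the 2x2 grayscale texture is an invented example, and real GPUs do this in hardware at enormous scale.

def bilinear_sample(texture, u, v):
    """Sample a tiny grayscale 'texture' (a 2D list) at floating-point
    coordinates (u, v) by blending the four nearest texels."""
    height, width = len(texture), len(texture[0])
    # clamp the sample point inside the texture
    x = min(max(u, 0.0), width - 1.0)
    y = min(max(v, 0.0), height - 1.0)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, width - 1), min(y0 + 1, height - 1)
    fx, fy = x - x0, y - y0
    # blend horizontally along the top and bottom rows, then vertically
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

if __name__ == "__main__":
    # made-up 2x2 grayscale texture: dark on the left, bright on the right
    tex = [[0.0, 1.0],
           [0.0, 1.0]]
    print(bilinear_sample(tex, 0.5, 0.5))  # halfway between dark and bright -> 0.5

Anisotropic filtering and mipmapping, mentioned in the timeline, build on this same blending idea to keep textures sharp at steep viewing angles and varying distances.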
Sunday, December 1, 2013
Computer Security Practicality
In the 21st century, wherever we go, it seems we are always surrounded by a computer in one form or another. The most common devices that almost everyone keeps on their person at all times are a smartphone and a laptop. These devices commonly contain items of major importance to us, whether personal information, work-related projects and data, or purchased virtual items. Thus it is of the utmost importance to keep our personal devices, and the items kept on them, safe from unwanted use.
Things that can damage the integrity of computer security include malware, worms, and trojans, which attempt to install themselves on your computer and seek out sensitive information; typical targets include usernames and passwords for financial accounts and credit card information. Other threats include social network attacks, which lure users with fake links into giving up personal information, and botnets, networks of compromised computers that can be put to work on tasks such as cracking encryption.
http://wyoming.gov/pdf/brochure_security-is-important.pdf
http://www.cisco.com/web/strategy/docs/education/C45-626825-00_Cyber_Security_Responsibility_AAG.pdf
Sunday, November 24, 2013
Artificial Intelligence and the Modern World
When the term "Artificial Intelligence" is brought up, the most common instant association is hyper-intelligent robots that have become self-aware and sentient and, with their newly gained consciousness, desire to wipe out humanity and destroy their creators.
That is the Hollywood version of artificial intelligence that has become widespread (although it is quite a fun one). The simplest way to describe an artificial intelligence is as a set of commands that determine the behavior of some kind of mechanism, most commonly a computer.
The concept of artificial intelligence was aided by the rapid development of computer science over the last century.
A set of commands that execute to perform functions is the basic idea of an artificial intelligence. Taken to a very complex level, these commands can cover so many functions and applications that they can perform highly complex tasks. One example is the Mars rover Curiosity, launched a few years back: a probe to Mars has to be equipped with scientific routines that can collect and test material on the planet and relay that information back to us on Earth.
In future applications, artificial intelligence is sure to become more widespread. In fact, it is already widespread, even if we don't notice it. Our calculators are extremely primitive examples of artificial intelligence that can compute mathematical formulas. A more modern example is the artificial intelligence of an enemy or non-player character in a video game. These models contain subroutines to be carried out when certain circumstances are met, such as attack, talk, sit, or idle, and the list is immense.
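To make the idea of subroutines triggered by circumstances concrete, here is a minimal sketch of a rule-based non-player character in Python; the states, distances, and thresholds are invented for illustration and are not taken from any particular game.

import random

def choose_action(distance_to_player, health):
    """Pick an NPC behavior from simple if/else rules, the way many game
    AIs select a subroutine when certain conditions are met.
    The thresholds here are made-up illustrative values."""
    if health < 20:
        return "flee"          # too hurt to keep fighting
    if distance_to_player < 2:
        return "attack"        # player is within striking range
    if distance_to_player < 10:
        return "chase"         # player is visible, close the gap
    return random.choice(["idle", "patrol"])  # nothing nearby to react to

if __name__ == "__main__":
    for dist, hp in [(1, 80), (5, 80), (30, 80), (1, 10)]:
        print(f"distance={dist:2d} health={hp:3d} -> {choose_action(dist, hp)}")

Real game AI layers many more rules, timers, and pathfinding on top of this, but the core pattern is the same: conditions in, behavior out.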
The pop-culture notion of robots taking over the world is far-fetched. It would take the combined computing power of every computer in existence to even attempt to match the power of the human brain.
http://aitopics.org/misc/brief-history
Friday, November 15, 2013
Computer Science: Post 1900s
Once the developmental basis had reached a strong foundation, for example Boolean and binary mathematics and mechanical devices that could perform some computations, the hardware and logic could be truly focused on and developed into computers and the logical languages that drive them.
During the 1940s, at the time of World War II, the need for an electronic computing machine was spurred immensely by complex calculations such as ballistic trajectories. Transmitted information and orders were also protected by encryption, which militaries wanted both to shield their own communications and to crack those of their enemies.
In 1951 the first compiler, a program that translates written code into a form the machine can execute, was created by Grace Murray Hopper. The compiler is an integral part of writing and running code. Around the same time, Alan Turing proposed the Turing Test, one of the first serious attempts to define machine intelligence.
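As a toy illustration of what a compiler does (this is only a sketch of the idea, nothing like Hopper's A-0 or any real compiler), the snippet below translates a tiny arithmetic expression into stack-machine instructions and then executes them.

def compile_expression(tokens):
    """Translate tokens like ["2", "+", "3"] into stack-machine instructions.
    Real compilers do the same job at a vastly larger scale: turn source
    text into instructions a machine can execute."""
    ops = {"+": "ADD", "-": "SUB", "*": "MUL"}
    left, op, right = tokens
    return [("PUSH", int(left)), ("PUSH", int(right)), (ops[op], None)]

def run(program):
    """Execute the compiled instructions on a tiny stack machine."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b, "MUL": a * b}[op])
    return stack.pop()

if __name__ == "__main__":
    program = compile_expression(["2", "+", "3"])
    print(program)       # [('PUSH', 2), ('PUSH', 3), ('ADD', None)]
    print(run(program))  # 5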
In 1960, the term "computer science" was coined by George Forsythe, a numerical analyst. Since then, computers and computer science have continued to grow in leaps and bounds, year after year, like clockwork.
Alan Turing
https://cs.uwaterloo.ca/~shallit/Courses/134/history.html
Computer Science: Pre 1900s
The notion of performing mathematical computations goes far back into the past; think of the great pantheons of antiquity, the Roman Colosseum, and the Great Pyramids of Giza. All of these great structures would have been impossible to build without mathematical computation. As time progressed, more and more devices and methods with specific mathematical functions came into existence, such as Napier's rods, which simplify multiplication, a mechanical adding machine by Blaise Pascal, and a loom that could weave intricate patterns by Joseph-Marie Jacquard.
In the centuries leading up to the 1900s, mathematics developed to a level unprecedented in history. This rise in mathematics and computation provided the basis for a field that would deal with these things: computer science. It produced many of the building blocks used in computer science today, such as Boolean logic by George Boole, binary arithmetic by Gottfried Wilhelm Leibniz, and one of the first attempts at mechanical computation by Charles Babbage.
Needless to say, these were some of the foundation blocks laid in mathematics and computation that gave rise to the logic and methodology on which computer science is founded and centered.
Sunday, November 10, 2013
P2P: Advantages and Disadvantages
What exactly is P2P? P2P stands for peer-to-peer and refers to a networking model in which multiple computer users share data with one another simultaneously in order to speed up the transfer process. We can see an example of an everyday P2P network in our homes, with all of a family's computers linked to the home network.
Unlike a client-server networking system, a P2P network does not rely on a single server, which is much slower than having multiple sources from which to download a file.
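As a rough sketch of why multiple sources speed things up, the toy example below splits a download into chunks and requests different chunks from different peers at the same time; the peer names and the fetch function are hypothetical stand-ins, not a real P2P protocol.

from concurrent.futures import ThreadPoolExecutor

# Hypothetical peers and a stand-in for "ask this peer for chunk i".
PEERS = ["peer-a.example", "peer-b.example", "peer-c.example"]

def fetch_chunk(peer, index):
    """Stand-in for a network request; a real client would contact the peer."""
    return f"<chunk {index} from {peer}>"

def download(num_chunks):
    """Fetch chunks concurrently, spreading requests across the peers,
    then reassemble the results in order."""
    with ThreadPoolExecutor(max_workers=len(PEERS)) as pool:
        futures = [pool.submit(fetch_chunk, PEERS[i % len(PEERS)], i)
                   for i in range(num_chunks)]
        return [f.result() for f in futures]

if __name__ == "__main__":
    print(download(6))

Because each peer only has to serve a slice of the file, no single machine becomes the bottleneck the way one overloaded server can.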
Most P2P activity occurs on the internet, where the file-sharing community can do much more than a home network. However, the P2P activity that occurs there is often classified as illegal because users are not restrained to sharing only certain files.
P2P sharing also has one big flaw: there may be malicious people sharing files that contain hidden viruses or trojans. Security is the one advantage client-server sharing has over P2P sharing.
However, there are ways to protect yourself if you prefer P2P: read the comments and stay away from files that have zero sharers and zero downloaders.
http://compnetworking.about.com/od/p2ppeertopeer/a/p2pintroduction.htm