Computer Terminology; Supercomputer and its Applications
Adware: A software package that automatically renders advertisements in order to generate revenue for its author.
Android: A Linux-based operating system designed primarily for touchscreen mobile devices such as smartphones and tablet computers.
Antivirus Software: Antivirus software consists of computer programs that attempt to identify threats and eliminate computer viruses and other malicious software (malware).
Artificial Intelligence: Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
Bandwidth: The maximum amount of data that can travel in a communication path in a given time, measured in bits per second (bps).
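As a rough illustration of how bandwidth relates to transfer time, the sketch below estimates how long a file takes to send over a link; the file size and link speed are illustrative assumptions, not values from the text.

```python
# Transfer time (s) = data size (bits) / bandwidth (bits per second).
# The file size and link speed below are illustrative assumptions.
file_size_bytes = 5 * 1024 * 1024        # a 5 MiB file
bandwidth_bps = 10_000_000               # a 10 Mbps link
transfer_time_s = (file_size_bytes * 8) / bandwidth_bps
print(round(transfer_time_s, 2))         # about 4.19 seconds
```

Note the factor of 8: file sizes are usually given in bytes, while bandwidth is measured in bits per second.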
Bar Code: A bar code is a machine-readable representation of information in a visual format on a surface. The first bar code system was developed by Norman Joseph Woodland and Bernard Silver in 1952.
Blog: It is a discussion or informational site published on the World Wide Web.
Bluetooth: A protocol that permits a wireless exchange of information between computers, cell phones and other electronic devices within a radius of about 30 feet.
Cookie: A packet of information that travels between a browser and the web server.
Debugging: It is a methodical process of finding and reducing the number of bugs (defects) in a computer program or a piece of electronic hardware, thus making it behave as expected.
Dots Per Inch (DPI): It is defined as the measure of the resolution of a printer, scanner or monitor. It refers to the number of dots in a one inch line. The more dots per inch, the higher the resolution.
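The relationship between DPI and resolution can be sketched with a quick calculation; the 300 DPI setting and letter-size page below are illustrative assumptions.

```python
# Pixel dimensions of a scan: pixels = inches x DPI, per axis.
# The scanner setting and page size are illustrative assumptions.
dpi = 300
width_in, height_in = 8.5, 11.0          # a letter-size page
width_px = int(width_in * dpi)           # 2550 pixels wide
height_px = int(height_in * dpi)         # 3300 pixels tall
print(width_px, height_px)
```

Doubling the DPI doubles the dot count along each axis, which is why higher DPI means higher resolution (and larger files).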
DVD: DVD is an optical disc storage format that can be used for data storage, including movies with high-quality video and sound.
Encryption: In cryptography, encryption is the process of encoding messages (or information) in such a way that hackers cannot read them, but authorised users can access them.
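The idea that the same key both encodes and decodes can be illustrated with a toy cipher. The XOR scheme below is for illustration only; real systems use vetted algorithms such as AES.

```python
# Toy symmetric cipher (XOR with a repeating key) - for illustration
# only; never use this in practice. Real systems use algorithms such
# as AES from a vetted cryptography library.
def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"meet at dawn"
key = b"secret"
ciphertext = xor_cipher(message, key)     # unreadable without the key
plaintext = xor_cipher(ciphertext, key)   # applying the key again decrypts
```

Without the key, the ciphertext is gibberish; with it, the original message is recovered exactly.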
Graphic Interchange Format (GIF): A simple file format for pictures and photographs that are compressed so they can be sent quickly.
JPEG: It is a commonly used method of lossy compression for digital photography. The term ‘JPEG’ is an acronym for the Joint Photographic Experts Group.
HyperText Transfer Protocol (HTTP): It is an important protocol used on the World Wide Web for moving hypertext files across the Internet. It requires an HTTP client program on one end and an HTTP server program on the other end.
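A sketch of what the client end of this exchange actually sends may help: HTTP is a text protocol, so a request is just formatted lines. The host and path below are illustrative assumptions, and no network access is needed here.

```python
# The raw text an HTTP client sends for one request; the host and
# path are illustrative assumptions. Header lines end with \r\n and
# a blank line marks the end of the headers.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "Connection: close\r\n"
    "\r\n"
)
# The server parses the first line into method, path and version.
method, path, version = request.split("\r\n")[0].split(" ")
```

The server replies in the same text format, with a status line (e.g. `HTTP/1.1 200 OK`) followed by headers and the requested file.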
Internet: The Internet (also known simply as ‘the net’) is the worldwide, publicly accessible system of interconnected computer networks that transmit data by packet switching using the standard Internet Protocol.
Internet Protocol (IP) Address: IP addresses are assigned to each and every computer on a TCP/IP network. They ensure that data on a network goes where it is supposed to go, e.g., 192.168.2.250.
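The example address can be examined with Python's standard `ipaddress` module; the /24 subnet below is an assumption chosen to match the example.

```python
import ipaddress

# Inspect the example address from the text; the /24 subnet is an
# assumed network containing it.
addr = ipaddress.ip_address("192.168.2.250")
network = ipaddress.ip_network("192.168.2.0/24")
in_subnet = addr in network      # True: the address lies in this subnet
is_private = addr.is_private     # True: 192.168.x.x is a private range
```

Membership checks like this are what routers effectively perform when deciding where data "is supposed to go".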
Internet Service Provider(ISP): An Internet Service Provider (ISP) is a business organization that offers users access to the Internet and related services.
LAN: It stands for Local Area Network. In a LAN, the connected computers are geographically close together, either in the same building or within a small area.
Piracy: The illegal copying of software or other creative works.
QWERTY: It is the standard computer keyboard layout, with the characters Q, W, E, R, T and Y on the top row of letters.
Router: A network device that enables the network to reroute messages it receives that are intended for other networks. The network with the router receives the message and sends it on its way exactly as received. In normal operation, routers do not store any of the messages that pass through them.
Scanner: An electronic device that uses light-sensing equipment to scan paper images such as text, photos and illustrations and translate the images into signals that the computer can then store, modify, or distribute.
Spam: Irrelevant or unsolicited messages sent over the Internet, typically to large numbers of users, for the purpose of advertising, phishing, spreading malware, etc.
Uniform Resource Locator (URL): The specific Internet address for a resource, such as the website of an individual or an organization.
Virus: A piece of computer code designed as a prank or malicious act to spread from one computer to another by attaching itself to other programs.
World Wide Web (‘www’ or ‘the Web’): A network of servers on the Internet that use hypertext-linked databases and files. It was developed in 1989 by Tim Berners-Lee, a British computer scientist, and is now the primary platform of the Internet.
ZIP: A widely used archive file format that supports lossless data compression; a ZIP file bundles one or more compressed files into a single archive.
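Python's standard `zipfile` module can demonstrate how a ZIP archive bundles and compresses files; the archive below is created in memory so no files are written to disk.

```python
import io
import zipfile

# Create a ZIP archive in memory, then read it back.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as archive:
    archive.writestr("notes.txt", "hello " * 200)   # compressible text
with zipfile.ZipFile(buf) as archive:
    names = archive.namelist()                      # files in the archive
    restored = archive.read("notes.txt").decode()   # decompressed content
```

Because the compression is lossless, the restored text is byte-for-byte identical to what was stored.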
Virtual Reality: An artificial environment created with computer hardware and software and presented to the user in such a way that it appears and feels like a real environment. To create this effect, the user needs hardware devices such as goggles, gloves and earphones with inbuilt sensors. It enables people to deal with information more easily: VR provides a different way to see and experience information, one that is dynamic and immediate.
• A supercomputer is a computer at the frontline of processing capacity, used for applications that require large and difficult mathematical calculations.
• Supercomputers are key to new reforms in all societal sectors of the economy, from manufacturing, defence, power, agriculture, communications, engineering sciences, banking, seismic sciences, weather and climate prediction, biotechnology, drug discovery, early warning for natural calamities, key data analysis and life sciences to data mining, among others.
The applications of supercomputers have been discussed below:
• Recreating the Big Bang: The “Big Bang” or the initial expansion of all energy and matter in the universe, happened more than 13 billion years ago in trillion-degree Celsius temperatures, but supercomputer simulations make it possible to observe what went on during the universe’s birth. Researchers can run models that require upward of a thousand trillion calculations per second, allowing for the most realistic models of these cosmic mysteries yet.
• Understanding earthquakes: By modeling the three-dimensional structure of the Earth, researchers can predict how earthquake waves will travel both locally and globally. The resulting techniques can be used to map the subsurface for oil exploration or carbon sequestration, and can help researchers understand the processes occurring deep in the Earth’s mantle and core.
• Modeling swine flu: Potential pandemics like the H1N1 swine flu require a fast response on two fronts: First, researchers have to figure out how the virus is spreading. Second, they have to find drugs to stop it. Supercomputers can help with both. During the recent H1N1 outbreak, researchers at Virginia Polytechnic Institute and State University in Blacksburg, Va., used an advanced model of disease spread called EpiSimdemics to predict the transmission of the flu.
• Testing nuclear weapons: Supercomputers run simulations to ensure that a country’s stockpile of nuclear weapons is functional and safe. The real aim is to create better simulations of nuclear explosions and to do away with real-world nuclear testing for good.
• Predicting climate change: The challenge of predicting global climate is immense. There are hundreds of variables, such as the reflectivity of the Earth’s surface. Dealing with these variables requires supercomputing capabilities. The resulting simulations both map out the past and look into the future. Models of the ancient past can be matched with fossil data to check for reliability, making future predictions stronger. New variables, such as the effect of cloud cover on climate, can be explored.
• Making possible more scientific advances: Supercomputing is needed for processing sophisticated computational models able to simulate the cellular structure and functionalities of the brain. This should enable us to better understand how our brain works and how we can cope with diseases such as those linked to ageing.
• More reliable decision-making: The world faces an increasing number of challenges at the local level as well as at the planetary scale. The convergence of HPC, Big Data and Cloud technologies will allow new applications and services in an increasingly complex scenario where decision-making processes have to be fast and precise to avoid catastrophes. Supercomputers are in the front line for developing essential public policies, from homeland security to climate action.
Different supercomputers are:
A. Tianhe-2 or TH-2 (“Milky Way 2”)
• It is a 33.86-petaflops supercomputer located in the National Supercomputer Center in Guangzhou, China. It was the world’s fastest supercomputer according to the TOP500 lists for June 2013, November 2013, June 2014 and November 2014. Tianhe-2 runs on Kylin Linux, a version of the operating system developed by NUDT. Resource management is based on the Simple Linux Utility for Resource Management (SLURM).
B. Param Yuva II
• India is among the few countries that invest heavily in supercomputer research and development. Param Yuva II is an Indian supercomputer developed by C-DAC in 2013. It is an improved version of the earlier Param series supercomputers. Yuva II was developed using hybrid technology: processors, coprocessors and hardware accelerators.
• Supercomputing is very important for the all-round advancement of the country and can be used for capacity-building and advanced research and development. Param Yuva II will be used for research in space, bioinformatics, weather forecasting, seismic data analysis, aeronautical engineering, scientific data processing and pharmaceutical development.
C. K Computer
• It is Japan’s fastest supercomputer, developed by the Japanese MNC Fujitsu, and has been active since 2011. The K computer is based on SPARC chips designed by Sun Microsystems. It has a distributed memory architecture and has achieved its performance target of 10 petaflops. The K computer is being used in a broad range of research fields including drug discovery, earthquake/tsunami research, weather forecasting, space science, manufacturing and material development.
NATIONAL SUPERCOMPUTING MISSION
• Over the years, supercomputers have benefited mankind in several ways. Thus, to improve capabilities beyond current levels, the National Supercomputing Mission has been launched by the government to connect national academic and R&D institutions with a grid of over 70 high-performance computing facilities, at an estimated cost of Rs 4,500 crore.
• These supercomputers will also be networked on the National Supercomputing grid over the National Knowledge Network (NKN).
• The NKN is another programme of the government which connects academic institutions and R&D labs over a high-speed network. Academic and R&D institutions as well as key user departments/ministries would participate by using these facilities and develop applications of national relevance.
• The Mission also includes development of highly professional High Performance Computing (HPC) aware human resource for meeting challenges of development of these applications.
• The mission would be implemented by the Department of Science and Technology and the Department of Electronics and Information Technology (DeitY) through the Centre for Development of Advanced Computing (C-DAC) and the Indian Institute of Science (IISc), Bangalore.