Thursday, March 31, 2011

Information Panopticon & The Development and Convergence of ICTs in The Organization of Work


Abstract:
              In this blog post I argue that the development and convergence of information and communication technologies (ICTs), exemplified by their use in the organization of work, is creating a global network of surveillance capabilities which affects the traveller. These surveillance capabilities are reminiscent of the 18th century philosopher Jeremy Bentham’s panopticon, and as such the emerging global surveillance network has been referred to as the ‘travel panopticon’.
History of Information Panopticon – Some Important Aspects:           
           
Jeremy Bentham was a philosopher and political radical who influenced the development of liberalism. Bentham was a significant figure in 19th century Anglo-American philosophy, but he is perhaps best known as an early advocate of utilitarianism. Indeed, his secretary and collaborator was James Mill, the father of John Stuart Mill—with whom utilitarianism is synonymous.
              Among Bentham’s insights and ideas was a plan for a multi-purpose disciplinary facility where the need for supervision was paramount. The primary function of the facility was as a prison or penitentiary, but Bentham thought the design equally applicable to schools, hospitals, ‘‘mad houses’’, and factories. In fact it was his brother Samuel’s efficient factory design which formed the basis for Bentham’s idea.


              The physical details of the panopticon are perhaps best given by paraphrasing Bentham himself: “The building is circular. The prisoners’ cells occupy the circumference, and are divided from each other to prevent all communication. The room for the inspector, or chief warden, occupies the centre of the building. Light is provided by a window in each cell, but the inspector’s room is designed in such a way that no direct light penetrates it from the perspective of the prisoners.”

             This last point is crucial, as it enables the power of seeing without being seen—one of the essential ‘‘qualities’’ of the panopticon. As Bentham noted, this has the effect of creating the idea of surveillance in the prisoners’ minds and has the benefit that the inspector need not be in the central inspection room at all times. Such a quality of apparent omnipresent surveillance also has the consequence of creating a ‘‘chilling effect’’, where not only are the prisoners’ behaviours modified by the very course of incarceration, but that the prisoners also participate in their own self-modification. Michel Foucault wrote further on this in his work ‘Discipline and Punish: The Birth of the Prison’, where he argued that invisible power gazes relentlessly upon society’s citizens, and puts them under such intense scrutiny that they become persuaded to participate in their own subjection.

ICTs - an Introduction:
              Information and communication technology, usually called ICT, is often used as an extended synonym for information technology (IT), but is usually a more general term that stresses the role of unified communications and the integration of telecommunications (telephone lines and wireless signals), intelligent building management systems, and audio-visual systems in modern information technology.
Use of ICTs -  The Organization of Work & The Global Network of Surveillance:
             
Information and communication technologies (ICTs) are reshaping many industries, often by reshaping how information is shared. Information-intensive industries, by their nature, show the greatest impacts, since ICTs enable information sharing and the bypassing of traditional information intermediaries. However, while the effects and uses of ICT are often associated with organizations (and industries), their use occurs at the individual level. In other words, it is changes to individual work related to the use of ICTs that reshape both organization and industry structures, and vice versa.

              The now well established ICT paradigm is one based on developments in digital computing technologies combined with advances in telecommunications capabilities. On top of this paradigm a whole range of new technological developments and applications is emerging: technologies which are enabling unprecedented convergence of hitherto disparate fields of human endeavour. In particular, the convergence of these technologies is having a profound effect on the policies and practices which nation states use for border control. This effect is transforming previously largely passive, incompatible, and primitive systems of border control into coordinated, sophisticated, and active systems of people tracking. This transformation has accelerated following the events of September 11, 2001 in the United States, and is resulting in the emergence of a seamless, ubiquitous, and continuous form of travel surveillance.
              The travelling public are generally unaware of the scale, depth, or sophistication of this travel surveillance. They are also often unaware that they are under surveillance at all, let alone by whom they are being surveilled. Such surveillance characteristics are reminiscent of the 18th century philosopher Jeremy Bentham’s panopticon, and as such the phenomenon of systematic global mass travel surveillance has been referred to as the travel panopticon.
              Bentham’s panopticon was an architectural plan intended (primarily) for a prison building where the prison’s guard could observe all prisoners, but where the prisoners could not see the guard. The travel panopticon has much of this same functionality, albeit in a more complex and multifaceted form. Its overall effect is one of ICT-enabled surveillance of people’s movement on a massive and global scale, as well as at increasingly fine levels of segmentation. The emergence of the travel panopticon raises profound ethical, political, and societal questions: namely, what effect is this emergence having on personal autonomy, which is an essential part of any canonical account of moral philosophy and one of the most important elements in the political tradition of liberalism?
              On the other hand, ‘trusted traveller’ or ‘fast track’ schemes, together with passenger data sharing agreements and radio frequency identification (RFID: sensor and scanner technologies that use radio waves to identify people or objects automatically), encourage the uptake of biometrics and RFIDs with the promise of faster queuing times and reduced ‘hassle’ at airports.
Conclusion:
               
In this post I have argued that the technology enabling the understanding and conveying of information expands daily in speed, efficiency, and boundary-spanning reach. This, in turn, may throw entire industries into turmoil, with some roles becoming redundant while completely new ones emerge. New approaches to work, knowledge, information, ICT, and organization structure are essential prerequisites for survival in this new environment.
              Common to all the ‘trusted traveller’ schemes mentioned above is their two-level method of operation. Their ‘public face’ is the potential to reduce waiting times at airports and other border areas and to reduce the ‘hassle factor’ in negotiating the often long lines of people and the myriad of forms necessary to enter or exit a country. These technological and informational practices have in turn been complemented by laws and international agreements diluting important aspects of human rights and human rights conventions. All of these initiatives have the potential in and of themselves to reduce personal autonomy, but it is in their combined application that the true surveillance capability of the travel panopticon is realised and the real consequences for personal autonomy and human dignity emerge.


References:
(i) Wikipedia, for definitions of some terms.

(ii)  Semple, J. (1993). ‘Bentham’s prison: A study of the panopticon penitentiary’. Oxford: Clarendon Press.
(iii) Daly, E. (2009). ‘Personal autonomy in the travel panopticon’. Springer Science+Business Media B.V.

(iv) Zuboff, S. ‘Dilemmas of Transformation in the Age of the Smart Machine’.

                                                                                                                                   Submitted by: Syed Ashruf,
                                                                                                                                                               AE09B025.

"What is the Information Panopticon? Discussed with reference to the use of ICT in the organization of work since the 80's." – surveillance state


The information panopticon represents a form of centralized power that uses information and communication technology (ICT) as observational tools and control mechanisms. The term originates from the Panopticon, a type of prison building designed by the English philosopher and social theorist Jeremy Bentham.

The concept of the design is to allow an observer to observe (-opticon) all (pan-) prisoners without the incarcerated being able to tell whether they are being watched. 
Within The Information Panopticon, Zuboff uses the architectural strategies of the panopticon as a metaphor to describe how information systems translate, record, and display human behavior. 

The information panopticon critiques how technological systems use transparency to assert power, control, and authority over us.

As these ICTs are introduced into the workplace, these information centers help managers to revamp their methods of communication, invite feedback, listen, coach, facilitate, manage many objectives, encourage autonomy, provide vision. In other words, technology can be used as a form of power that displays itself automatically and continuously.

Is the government creating a surveillance state?
In many developed countries, the evidence grows by the week that the government is creating a surveillance state.
It has a database containing the international travel records of all citizens.

In addition, the records of all children are to be held on a national database, a national ID database is currently being developed, all health records currently held by GPs will be made centrally available, and a database of DNA profiles is being assembled, ostensibly for criminals.
Meanwhile, the ubiquitous CCTV cameras in every public space make personal privacy increasingly hard to maintain.
Even in the name of countering crime or combating terrorism, why should the state know where you are going, where you have been and whom you call while watching everyone’s movements on camera?

This is how the state (government) surveils its citizens.

Gokul
CH09B065

Information Panopticon


The information panopticon represents a form of centralized power that uses information and communication technology (ICT) as observational tools and control mechanisms. The English philosopher and social theorist Jeremy Bentham developed the original architecture of the panopticon as a prison. The idea was that "total" surveillance would eventually eliminate undesired behaviour. While Bentham's idea was literal, it has since become a metaphor for any type of system in which surveillance is, or can be, total.

The combination of information and communication technology used in call centres means it is possible for managers to monitor worker performance and productivity remotely, embodying the themes of panoptic control and surveillance. This monitoring ranges from keystroke counting; telephone service observation, whereby statistics are gathered on the duration of, time between, and number of calls; telephone call accounting; "peeking" onto workers' computer screens and into electronic mail; to the use of "active" or "magic" badges that can keep track of an employee's movements and locations. Increasingly, computers are being used to set tasks and performance targets for all levels of worker.

The responsibility of this technical authority raises the question of what ethical, social, and professional surveillance is acceptable in response to ICT in the workplace. Surveillance in the workplace is not necessarily new; it has long been around in the form of corporate policy, collective behavior, and social traditions. Zuboff describes how maintaining the faith that undergirds imperative control is hard work: psychologically demanding, time-consuming, and inevitably prone to ambiguity (Zuboff 360).
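The telephone statistics described above (duration of, time between, and number of calls) are trivial to compute once call start and end times are logged, which is part of what makes this kind of monitoring so pervasive. A minimal sketch in Python; the call log, timestamps, and variable names are all hypothetical, invented purely for illustration:

```python
from datetime import datetime

# Hypothetical call log for one agent: (start, end) timestamps.
calls = [
    (datetime(2011, 3, 1, 9, 0, 0), datetime(2011, 3, 1, 9, 4, 30)),
    (datetime(2011, 3, 1, 9, 6, 0), datetime(2011, 3, 1, 9, 9, 0)),
    (datetime(2011, 3, 1, 9, 15, 0), datetime(2011, 3, 1, 9, 20, 0)),
]

# Number of calls handled.
call_count = len(calls)

# Average call duration, in seconds.
avg_duration = sum((end - start).total_seconds() for start, end in calls) / call_count

# Average idle gap between consecutive calls, in seconds.
gaps = [(calls[i + 1][0] - calls[i][1]).total_seconds() for i in range(call_count - 1)]
avg_gap = sum(gaps) / len(gaps)

print(call_count, avg_duration, avg_gap)  # 3 250.0 225.0
```

From the employee's side, nothing here looks like surveillance: it is just a log file and a few lines of arithmetic, which is exactly the invisibility the panopticon metaphor describes.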
The capacity of these surveillance systems will accomplish some goals and create entirely new, unresolved problems: what is to be done with all of this personal data? Like the panoptic prison, the information panopticon focuses on creating a vulnerable, defenseless user. However, employees are not prisoners: they are not without some sense of control, and they certainly should question these business practices. It falls to the users, the employees, not to participate passively in surveillance but rather to actively place responsibility on management and administration to organize effectively. As ICTs continue to act as control mechanisms within the workplace, management should tirelessly redevelop systems that respond not only to power but also to the emotional, the personal, and the complexity of human behavior. Today, close to a billion workers are surveilled electronically in workplaces all over the world. Management often views surveillance as an attempt to achieve certain organizational goals by more fully utilizing time and other resources. But from the employees' perspective, surveillance can also be understood as an attempt to create new power relationships based on an electronic version of Bentham's panopticon.

- N.HARDEV
CH09B066

Information Panopticon


At the end of World War II, the electronic digital computer technology we take for granted today was still in its earliest infancy. It was expensive, failure-prone, and ill-understood. Digital computers were seen as calculators, useful primarily for accounting and advanced scientific research.

The information panopticon represents a form of centralized power that uses information and communication technology (ICT) as observational tools and control mechanisms. The English philosopher and social theorist Jeremy Bentham developed the original architecture of the panopticon as a prison. Be it a personal computer or a database system, both promote forms of interconnectivity that require a centralized control centre. The physical location of this centre is analogous to the central surveillance tower of the panopticon.

As these ICTs are introduced into the workplace, information centres help managers to revamp their methods of communication: to invite feedback, listen, coach, facilitate, manage many objectives, encourage autonomy, and provide vision. In other words, technology can be used as a form of power that displays itself automatically and continuously. This places employees in a passive and obedient position, where they no longer know or understand exactly how panoptic power is being enforced. Consequently, administrative actions within the workplace can appear as paranoid and unspecified approaches to security. The responsibility of this technical authority raises the question of what ethical, social, and professional surveillance is acceptable in response to ICT in the workplace.
Surveillance in the workplace is not necessarily new; it has long been around in the form of corporate policy, collective behavior, and social traditions. Zuboff describes how maintaining the faith that undergirds imperative control is hard work: psychologically demanding, time-consuming, and inevitably prone to ambiguity (Zuboff 360). The capacity of these surveillance systems will accomplish some goals and create entirely new, unresolved problems: what is to be done with all of this personal data? Like the panoptic prison, the information panopticon focuses on creating a vulnerable, defenseless user. However, employees are not prisoners: they are not without some sense of control, and they certainly should question these business practices. It falls to the users, the employees, not to participate passively in surveillance but rather to actively place responsibility on management and administration to organize effectively. As ICTs continue to act as control mechanisms within the workplace, management should tirelessly redevelop systems that respond not only to power but also to the emotional, the personal, and the complexity of human behavior.

-
R.SURENDER NAIK
CH09B071

"What is the Information Panopticon? Explain with reference to the use of ICT in the organization of work since the 80's."


ABSTRACT:
This article discusses the Information Panopticon and illustrates it with the example of police surveillance of speeding vehicles using cameras.
INTRODUCTION:
The information panopticon represents a form of centralized power that uses information and communication technology (ICT) as observational tools and control mechanisms
HISTORY:
Prisons were the nucleus of the present surveillance system. Prisons were built to confine "criminals", as defined by the judiciary. These prisoners had to be under scan at all times, so surveillance was inevitable. To curb the dishonesty and idleness of human agents, surveillance then entered industry. Slowly it entered educational institutions, where "discipline" was needed. Surveillance has now even entered reality shows for fun (Big Brother/Bigg Boss are shows where people live in a house with no connection to the outside world, under constant CCTV surveillance). But the surveillance technique can also be used for something very important, like preventing crime.
INFORMATION PANOPTICON:
The information panopticon represents a form of centralized power that uses information and communication technology (ICT) as observational tools and control mechanisms. The English philosopher and social theorist Jeremy Bentham developed the original architecture of the panopticon as a prison. The structure consisted of a centralized tower surrounded by a circular building divided into prison cells. Bentham's concept was to maximize the number of prisoners that could be observed by one individual within the tower. In 'The Information Panopticon', Zuboff uses the architectural strategies of the panopticon as a metaphor to describe how information systems translate, record, and display human behavior with a degree of illumination that would have exceeded even Bentham's most outlandish fantasies. The information panopticon critiques how technological systems use transparency to assert power, control, and authority over users.
Inherent within many new technological and informational devices is the ability to network. Whether in a personal computer or a database system, these applications often promote forms of interconnectivity that require a centralized control center. In early telecommunication experiments by inventors like Alexander Bell and Samuel Morse, the idea of transmission was essentially linear: a message was sent from one location to another, traveling down a wire. As innovation progressed, communication began to operate through various access nodes within a network. The physical locations of the switching and control centers began to operate in ways very similar to the central surveillance tower of the panopticon.
As these ICTs are introduced into the workplace, managers and employees are discovering the hierarchical risks within information authority. Zuboff explains that these information centers help managers in a workplace to revamp their methods of communication: to invite feedback, listen, coach, facilitate, manage many objectives, encourage autonomy, and provide vision. The engagements a manager previously dealt with in a face-to-face setting can now be administered through a system that operates in a ubiquitous way. In other words, technology can be used as a form of power that displays itself automatically and continuously.
In a work setting, this method of control differs from that of the original panopticon because many ICT systems function as transparent architectures. The technological knowledge needed to understand how one is being surveilled is not as apparent as in Bentham's prison. The techniques of control within informational and networked systems often appear pragmatic, immediate, and technical. This places employees in a passive and obedient position, where they no longer know or understand exactly how panoptic power is being enforced. Consequently, administrative actions within the workplace can appear as paranoid and unspecified approaches to security.
The responsibility of this technical authority raises the question of what ethical, social, and professional surveillance is acceptable in response to ICT in the workplace. Surveillance in the workplace is not necessarily new; it has long been around in the form of corporate policy, collective behavior, and social traditions. Zuboff describes how maintaining the faith that undergirds imperative control is hard work: psychologically demanding, time-consuming, and inevitably prone to ambiguity. The capacity of these surveillance systems will accomplish some goals and create entirely new, unresolved problems: what is to be done with all of this personal data? Like the panoptic prison, the information panopticon focuses on creating a vulnerable, defenseless user. However, employees are not prisoners: they are not without some sense of control, and they certainly should question these business practices. It falls to the users, the employees, not to participate passively in surveillance but rather to actively place responsibility on management and administration to organize effectively. As ICTs continue to act as control mechanisms within the workplace, management should tirelessly redevelop systems that respond not only to power but also to the emotional, the personal, and the complexity of human behavior.

EXAMPLE OF CAMERAS FOR SPEED CONTROL
Computing is used in speed cameras to detect the speed of a vehicle.
This is done either by sending a laser or radar beam at the passing vehicle, which is reflected back to the speed camera equipment to provide an exact speed, or by using inductive loops in the road: if the passing vehicle drives too fast over the loops, the speed camera is triggered.
All these methods involve technology and help the police force give speeding tickets to people who have broken the speed limit law. The use of these cameras helps drivers realise they must slow down, or their driving licences will be taken away, they will receive points on their licence, or they will receive a fine, all of which helps decrease the chances of the speed limit laws being broken.


BOTTOM LINE
Speed cameras mainly have benefits (catching speeding vehicles and therefore reducing the number of people speeding in the future). This system helps the police because "we can observe more people at the same time and can punish more lawbreakers", which was the principle of Jeremy Bentham's prison architecture: to let one observer view more prisoners, and here to help the police "dominate" by making the public's traffic transparent through cameras.

B.Vamshi
EE09B104


Wednesday, March 23, 2011

Risks and Responsibilities


   Whirlwind, under the direction of engineer Jay Forrester, actually began in 1944 as an analog computer for use in a flight simulator, funded by the Navy. News about the ENIAC and EDVAC digital computer projects led Forrester to abandon the analog approach in early 1946, but the original application goal of a flight simulator remained. In theory flight simulators were, and remain, what is known as a "dual-use" technology, equally useful for training military and civilian pilots.

It is important to emphasize that at this historical juncture these were not obvious goals for a digital computer. Analog computers and control mechanisms (servomechanisms) were well developed, with sophisticated theoretical underpinnings. (Indeed, Forrester began his work at MIT as a graduate student in Gordon Brown's Servomechanisms Laboratory.) Analog controllers did not require the then-complex additional step of converting sensor readings into numerical form and control instructions into waveforms or other analog signals (Valley 1985). Mechanical or electro-mechanical devices were inherently slower than electronic ones, but there was no inherent reason why electronic computers or controllers should be digital, since many electronic components have analog properties. Numerous electronic analog computers were built during and after the war.

Most other projects saw electronic digital computers as essentially giant calculators, primarily useful for scientific computation. Their size, their expense, and this vision of their function led many to believe that, once perfected, only a few (perhaps only a couple) of digital computers would ever be needed. Even Forrester at one time apparently thought that the entire country would eventually be served by a single gigantic computer.

In the contemporary context, missile warning systems pose many risks for the world. The greatest risk is when the system gives a false alarm, which may lead to misunderstanding and even war.
In 1983, Lt. Colonel Stanislav Petrov was the officer in charge of the Soviet Union's warning system monitoring station outside Moscow. One day he received signals indicating that five ballistic missiles had been launched by the US. Tensions between the states were already high, and this false alarm could have resulted in a war. Petrov reasoned that had America launched a nuclear attack against Russia, missiles would have been raining down, not just five; he therefore judged it a false signal, saving the world from another devastating war.

The Whirlwind computers developed during the Cold War are a classic example of the risk involved in any system. The duplex computer enables us to keep data safe: even if one of the computers gets damaged or destroyed, the other can take over. This kind of computer system is used in banks all across the globe.
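The duplex idea (one computer taking over when the other fails) can be sketched in a few lines. This is a toy illustration only; the class names, the doubling "calculation", and the failure simulation are all invented, and real duplexed systems (like SAGE's) involve far more, such as state synchronization between the pair:

```python
# Toy sketch of duplex operation: two machines can handle the same work;
# if the primary fails, the standby takes over and service continues.

class Computer:
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def process(self, value):
        if not self.healthy:
            raise RuntimeError(f"{self.name} is down")
        return value * 2  # stand-in for the real calculation

def duplex_process(primary, standby, value):
    """Try the primary; fall back to the standby on failure."""
    try:
        return primary.process(value)
    except RuntimeError:
        return standby.process(value)

primary = Computer("A", healthy=False)   # simulate the primary failing
standby = Computer("B")
result = duplex_process(primary, standby, 21)
print(result)  # 42 -- the standby answered, so the service never stopped
```

The same failover pattern, with real replication underneath, is what lets a bank keep serving customers when one machine in the pair is lost.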
    Therefore we can say that the risk grows in direct proportion to the technology.

-
R.SURENDER NAIK
CH09B071

Monday, March 21, 2011

Need of National Security during the Cold War period - Risks & Responsibilities with the Space Race


INTRODUCTION:
Computers were used for various operations involving extensive calculations (in cybernetics) during World War II, and the US became the superpower after the war through this very idea applied to anti-aircraft weaponry and artillery.

CLOSED WORLD DISCOURSES & CYBORG’S DISCOURSES:
After World War II a closed world was created, in which every thought, word, and action was directed toward a central goal: national security during the Cold War period.
The Cold War refers to the period after World War II of silent conflict between the US and USSR over the military, and the science and technology, that determined their supremacy over other nations.
During this very period, digital computers were developed and extensively used for the fast calculation of ballistics employed in anti-aircraft weaponry, and also for wide, overall surveillance of the nation, which was considered very necessary at this time as a matter of national security.

SPACE RACE – RISKS &RESPONSIBILITIES:
              One of the silent conflicts of the Cold War was the Space Race between the US and USSR: the fight between these countries for ultimate supremacy in outer space exploration, which was regarded as a great threat to national security.
              The other reason for this fight was the reputation a nation would gain by winning the race, which shaped the attitude of the US at the time.
So the risks involved in this research were also a concern, because in this fight between the nations they were fast enough to produce space vehicles but lacked efficiency and safety for the people who had to travel in them.
This means they neglected the danger involved in such actions, as getting the best technology was the only concern in the race. Because of this there were many space vehicle accidents, such as Apollo 1.

CONCLUSION:
              Even though the Space Race was a responsibility towards national security and an action to secure supremacy for these nations, they were not concerned about their responsibility towards people's lives, and the risks involved were not given preference.
              The reason for not considering the risk can be attributed to the overriding preference given to national security and patriotism.
                                                                                                                                    Submitted by: Syed Ashruf,
                                                                                                                                                                              AE09B025.

The development and further research on computer was associated with perceived need of national security during the cold war. Discussed with contemporary example of cyber war - hacking – wikileaks.


After World War II there was a period of tension and proxy wars between the new superpowers of the world. During this period each had to keep an eye on the other in order to stay alert and secure, prepared for whatever came.
With developments in aircraft and submarines, an attack was possible from any direction. Radars, tracking devices, and warning systems were installed all over these countries to watch for a possible enemy attack. All these systems generated lots of data which needed to be viewed and analyzed. Human computers were not capable of handling this huge amount of data, so someone came up with the idea of handling the data with a mechanical/digital computer.
We clearly see this in the SAGE project, which needed analysis of data and ballistic calculations; these were done by twin computers working in tandem. This made the calculations easy and hence gave America the upper hand in its defense system, which after the war depended on the so-called digital computers.
People began to make smarter and smarter computers to strengthen their defense.
In the modern digital world computer performs lots of other work but we have to remember that these were developed as a part of defense system.
In modern times all this data travels from one part to another through wires, i.e. the Internet. This has made the lives of many easier; banks and corporate sectors have gone through tremendous change over these years. Who would have thought of the idea of wired cash in 1900?
But we should not forget the dark side, or the risk involved in it. Is all the data sent through these networks secure? The answer is no: there is a risk that data sent over a network can be intercepted by another person and taken advantage of.

Here comes the role of the hacker. A hacker is a person with deep knowledge who knows the backdoors of a network.
This type of hacker breaking into others' systems or networks and benefiting from their data is very common.

In the recent, widely publicized WikiLeaks case, cables sent by US diplomats to Washington were obtained and released, creating huge tension among nations.

There is another kind of hacking group whose members call themselves cyber soldiers. They break into other countries' important websites and databases, such as defense and nuclear reactor systems.
Such cyber wars among countries have been going on for a few years now; with the growing number of computers, a cyber war on a country can destabilize its economy, defense system, and more.

Recently a group called the Indian Cyber Army hacked into Pakistani defense and ministry websites in reaction to the Mumbai attacks; the Pakistani group PakBugs attacked CBI websites in return.
With this kind of cyber war, countries have to improve not only their defenses but also their computer and database security.


Gokul Krishna 
CH09B065

Risks and Responsibilities in ATC

The development and further research on Computers was associated with the perceived need of national security during the Cold War period.
Discuss, with contemporary examples, some of the risks and responsibilities involved in such justifications.

After World War II, the USA and the USSR rose as the new superpowers and, in the name of national security, conducted further research
in the field of science and created many weapons, raising tensions between the two countries and resulting in the Cold War.

As a result, America built the SAGE system, which detected hostile aircraft approaching the nation's borders and could immediately
dispatch interceptors to counter them. All this was done with duplexed computers: one machine ran the system while the other stood
by as a backup in case the first failed. The system was the first of its kind. Military personnel constantly observed the skies on
their screens and flagged any unauthorised aircraft, identified by code.
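The duplexed arrangement described above can be sketched as a simple active/standby pattern. The class and names below are purely illustrative, not taken from the actual AN/FSQ-7 design:

```python
class DuplexSystem:
    """Toy model of SAGE-style duplexing: two identical machines run
    side by side, and the standby takes over when the active one fails.
    All names here are hypothetical, for illustration only."""

    def __init__(self):
        self.machines = {"A": "healthy", "B": "healthy"}
        self.active = "A"

    def heartbeat(self):
        """Return the machine currently serving; fail over if needed."""
        if self.machines[self.active] != "healthy":
            standby = "B" if self.active == "A" else "A"
            if self.machines[standby] == "healthy":
                self.active = standby  # hot standby becomes active
        return self.active

duplex = DuplexSystem()
print(duplex.heartbeat())          # machine A is serving
duplex.machines["A"] = "failed"    # simulate a tube burning out
print(duplex.heartbeat())          # machine B has taken over
```

The point of the pattern is that the failover decision is made on every cycle, so a fault is masked within one heartbeat rather than requiring manual intervention.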
Vannevar Bush says that -
"This war emphasizes three facts of supreme importance to national security: (1) Powerful new tactics of defense and offense are developed around new weapons created
by scientific and engineering research... (3) war is increasingly total war, in which the armed services must be supplemented by active participation of every
element of civilian population. To insure continued preparedness along farsighted technical lines, the research scientists of the country must be called upon to
continue in peacetime some substantial portion of those types of contribution to national security which they have made so effectively during the stress of the
present war (Bush 1945, p. 12)".

One contemporary example of this sort is the Air Traffic Control system stationed at airports in nearly every country; some
countries even use such systems for defense purposes, much like SAGE.
Air traffic control (ATC) is a service provided by ground-based controllers who direct aircraft on the ground and in the air. The primary purpose of ATC systems
worldwide is to separate aircraft to prevent collisions, to organize and expedite the flow of traffic, and to provide information and other support for pilots
whenever able.
Preventing collisions is referred to as separation: keeping aircraft from coming too close to each other by applying lateral, vertical and
longitudinal separation minima; many aircraft now have collision avoidance systems installed to act as a backup to ATC observation and instructions. In addition to
its primary function, the ATC can provide additional services such as providing information to pilots, weather and navigation information and NOTAMs (NOtices To
AirMen).
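The separation idea above can be sketched as a tiny check. The 5 NM lateral and 1,000 ft vertical figures are typical radar separation minima, used here only as illustrative defaults rather than a claim about any specific airspace:

```python
def safely_separated(lateral_nm, vertical_ft,
                     min_lateral_nm=5.0, min_vertical_ft=1000):
    """Two aircraft are considered separated if they meet EITHER the
    lateral OR the vertical minimum; a conflict requires both to be
    lost. Minima here are illustrative defaults, not regulations."""
    return lateral_nm >= min_lateral_nm or vertical_ft >= min_vertical_ft

print(safely_separated(8.0, 0))        # laterally separated
print(safely_separated(1.0, 2000))     # vertically separated
print(safely_separated(2.0, 500))      # both minima lost: conflict
```

The "either minimum suffices" logic is why controllers can run two aircraft directly over one another as long as the vertical gap holds.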
The primary method of controlling the immediate airport environment is visual observation from the airport traffic control tower (ATCT). The ATCT is a tall, windowed structure located on the airport grounds. Aerodrome or Tower controllers are responsible for the separation and efficient movement of aircraft and vehicles operating on the taxiways and runways of the airport itself, and aircraft in the air near the airport
The areas of responsibility for ATCT controllers fall into three general operational disciplines: Local Control (or Air Control), Ground Control, and Flight Data/Clearance Delivery. Ground Control
is responsible for the airport "movement" areas, as well as areas not released to the airlines or other users. This generally includes all taxiways, inactive runways, holding areas, and some transitional aprons or intersections where aircraft arrive, having vacated the runway or departure gate. Exact areas and control responsibilities are clearly defined in local documents and agreements at each airport. Any aircraft, vehicle, or person walking or working in these areas is required to have clearance from Ground Control. This is normally done via VHF/UHF radio, but there may be special cases where other processes are used. Most aircraft and airside vehicles have radios. Aircraft or vehicles without radios must respond to ATC instructions via aviation light signals or else be led by vehicles with radios. People working on the airport surface normally have a communications link through which they can communicate with Ground Control, commonly either by handheld radio or even cell phone. Ground Control is vital to the smooth operation of the airport, because this position impacts the sequencing of departure aircraft, affecting the safety and efficiency of the airport's operation.
Some busier airports have Surface Movement Radar (SMR), such as, ASDE-3, AMASS or ASDE-X, designed to display aircraft and vehicles on the ground. These are used by Ground Control as an additional tool to control ground traffic, particularly at night or in poor visibility. There are a wide range of capabilities on these systems as they are being modernized. Older systems will display a map of the airport and the target. Newer systems include the capability to display higher quality mapping, radar target, data blocks, and safety alerts, and to interface with other systems such as digital flight strips
Risks involved:
1. Traffic: Several factors dictate the amount of traffic that can land at an airport in a given amount of time. Each landing aircraft must touch down, slow, and exit the runway before the next crosses the approach end of the runway. This process requires at least one and up to four minutes for each aircraft. Allowing for departures between arrivals, each runway can thus handle about 30 arrivals per hour.
Problems begin when airlines schedule more arrivals into an airport than can be physically handled, or when delays elsewhere cause groups of aircraft that would otherwise be separated in time to arrive simultaneously. Aircraft must then be delayed in the air by holding over specified locations until they may be safely sequenced to the runway. Up until the 1990s, holding, which has significant environmental and cost implications, was a routine occurrence at many airports.
2. Weather: Rain, ice or snow on the runway cause landing aircraft to take longer to slow and exit, thus reducing the safe arrival rate and requiring more space between landing aircraft. Fog also requires a decrease in the landing rate. These, in turn, increase airborne delay for holding aircraft. If more aircraft are scheduled than can be safely and efficiently held in the air, a ground delay program may be established, delaying aircraft on the ground before departure due to conditions at the arrival airport.
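The traffic figure in point 1 (about 30 arrivals per hour) follows directly from the per-aircraft runway occupancy time; a minimal sketch, assuming a fixed spacing between arrivals:

```python
def arrivals_per_hour(minutes_between_arrivals):
    """Runway arrival capacity under a fixed inter-arrival spacing.
    Ignores departures, weather, and sequencing; illustration only."""
    return 60 / minutes_between_arrivals

# At ~2 minutes per landing a runway handles about 30 arrivals/hour;
# at the 4-minute worst case quoted above, only 15.
print(arrivals_per_hour(2))
print(arrivals_per_hour(4))
```

This simple ceiling is exactly why over-scheduling or weather-slowed runway exits translate so quickly into airborne holding.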
In spite of all this, when every option disappears, the sole burden of carrying the passengers safely to their destination rests on the pilot alone. Thus the ATC personnel, and above all the pilot, must act with responsibility.

G.Abhilash Roy,
CS09B012.

Sunday, March 20, 2011

Risks and Responsibilities



Whirlwind was built at MIT, originally for flight simulation.
It began as a plan for an analog flight simulator but was redesigned as a digital computer.
It gave outputs such as the aircraft's altitude depending on the inputs fed to it.
It was developed in the years during and after World War II (roughly 1945-1951).
It was of dual use: intended to train both military pilots and civilian pilots.
The demand for trained pilots had grown rapidly during the war.
The SAGE system was then brought into action, and it was derived from Whirlwind:
the prototype for the SAGE computers was the Whirlwind design.

Each SAGE direction center was a four-floor building without windows, with thick concrete walls designed to be blast resistant.
The SAGE computers had many parts and were manufactured by IBM.

This system was used to detect any planes flying into the country.
These systems ran 24x7, so the vacuum tubes of the computer were specially developed so that they wouldn't break down.
But there is a risk of the system malfunctioning, and if it does, a false alarm could even lead to war with other countries.
My second example is banking. Before and during World War II most of the information required by banks was stored by humans, and the accounts were managed by humans alone, so banking was ideal for computerization.

Computers can store all the data of a bank's customers without making mistakes, and can handle the accounts of any number of people.

The risk here is that computers can be hacked: if the security of the servers is not good enough, the money in bank accounts can be tampered with or transferred away.
As the technology advances, the risk involved in it also increases.

- N.HARDEV
CH09B066

The development and further research on Computers was associated with the perceived need of national security during the Cold War period. Discuss, with contemporary examples some of the risks and responsibilities involved with such justifications.


Abstract
This post discusses the further research on computers that was associated with the perceived need for national security during the Cold War period, and the risks and responsibilities that arose from the competitive spirit between the two blocs. It illustrates those risks and responsibilities with the contemporary example of nuclear bomb testing.
Introduction:
What is cold war?
The Cold War (1947–1991) was the continuing state of political conflict, military tension, proxy wars, and economic competition existing after World War II (1939–1945) between the Communist world and the powers of the Western world. Although the primary participants' military forces never officially clashed directly, they expressed the conflict through military coalitions, proxy wars, espionage, propaganda, conventional and nuclear arms races, appeals to neutral nations, rivalry at sports events, and technological competitions such as the Space Race.
Political Conditions during Cold War:
Immediately after the Second World War, there were differences between the USSR and the USA over the splitting of Germany. During the Cold War the world was literally split in two, with the predominant USA on one side and the USSR on the other. Because of the Berlin Blockade, the differences between the USA and the USSR grew far wider. The United States' new status as a superpower, the central role of science and technology in the war effort, and the massive wartime federal funding together advanced communal aims for science.
So, due to the division of the world into two major superpowers, there existed a need for national security because of the "tense" differences between the two parties.
FURTHER RESEARCH IN COMPUTERS DUE TO THE NEED FOR NATIONAL SECURITY
According to Edwards, "practical military objectives guided technological development down particular channels, increased its speed, and helped shape the structure of the emerging computer industry". The feverish technical developments of WWII weaponry generated demand for huge numbers of computations to solve ballistics and coding problems, and, because of their urgency, for unprecedented rates of speed. It was to this end that programmable, electronic digital computers, capable of dramatically faster calculation, were developed. ENIAC, for example, was constructed by the US Army Ordnance Department to automate the tedious calculation of ballistics tables.
WWII-era computers produced only limited impacts on the military, since they were used simply to speed up existing processes.
But these military projects did produce local concentrations of researchers working on electronic digital techniques, and these groups persisted after the war, providing the social and organizational nucleus for future research.

TAKING THE EXAMPLE OF NUCLEAR BOMB TESTING
Due to Cold War conditions, many countries, in the name of national security, started improving their nuclear weaponry for defense. Even developing countries like India and Pakistan improved their nuclear weaponry, which led to acute tension in the world. A competition between two or more parties for the best armed forces existed during the Cold War: each party competed to produce larger numbers of weapons, greater armies, or superior military technology in a technological escalation.

RESPONSIBILITIES:
Defense is the main responsibility attached to the use of nuclear weapons.
Protecting these nuclear weapons from terrorist forces.
Using this weaponry correctly.
Informing people about the need for, and consequences of, nuclear weapons.

RISKS
After the bombings of Hiroshima and Nagasaki there is no real need to spell out the risks. Everyone knows the risk of a nuclear attack from another country, and also of the radiation caused by the testing of nuclear weaponry.

Over the course of the Cold War, the USA carried out about 1,054 nuclear tests and the USSR about 715; France conducted 210, the UK and China about 45 each, and India and Pakistan 6 each. From these figures we can gauge the amount of radiation added to the environment by these explosions.

CONCLUSION:
Now the UN is taking responsibility for limiting the use of nuclear weaponry and maintaining the integrity of the world, but nuclear weapons testing still goes on. There are grave risks if these weapons fall into the wrong hands, such as terrorists who simply want to create havoc in a country. So there are many risks in having and testing nuclear weaponry.

B.VAMSHI

Sunday, March 13, 2011

Role of Science & War during the 1930s


Abstract:

In this blog post, I'll discuss the role of science and war during the 1930s by explaining the emergence of the cybernetic vision and the anti-aircraft (AA) weapons it produced, through which, during WWII, the mechanized soldier faced his opponent as a machine, and machines manifested themselves as people.

Vannevar Bush - Military Advisory, WWII:

In the 1930s, US military research was relatively small and disorganized. It was performed primarily by military staff and often duplicated between different branches of the military. On June 12, 1940, Vannevar Bush met with President Roosevelt to detail a plan for changing the organization of military research & proposed a new organization, called the National Defense Research Committee (NDRC), which would bring together government, military, business, and scientific leaders to coordinate military research.

Science at War – The Calculating Enemy :

In the meantime, three closely related sciences engaged this calculating enemy: operations research, game theory, and cybernetics. Each had its own prototypical war problem. Operations research focused, for example, on maximizing efficiency in locating and destroying German U-boats in the North Atlantic and along the coast of the Americas. Game theory, though it had mathematical roots in the interwar years, exploded into view with John von Neumann and Oskar Morgenstern's masterwork of 1944, 'Theory of Games and Economic Behavior'; military analysts picked up the technique as a way of analyzing what two opposing forces ought to do when each expected the other to act in a maximally rational way but was ignorant both of the opponent's specific intentions and of where the enemy would choose to bluff. Wiener, the spokesman and advocate of cybernetics, in a distinction of great importance to him, divided the devils facing us into two sorts. One was the "Manichean devil", "who is determined on victory and will use any trick of craftiness or dissimulation to obtain this victory." Wiener's rational Manichean devil could, for example, change strategy to outwit us. By contrast, the other, the "Augustinian devil" (and Wiener counted the forces of nature as such), was characterized by the "evil" of chance and disorder but could not change the rules. Exemplary of the Manichean enemy, von Neumann's game theory postulated a logical but cunning opponent; it was designed precisely to analyze an antagonist who played against us and would bluff to win.
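Von Neumann's prescription for facing a maximally rational opponent can be illustrated with a toy maximin calculation over a two-strategy zero-sum game. The payoff numbers and strategy labels below are invented purely for illustration:

```python
# Rows are our pure strategies; columns are the opponent's responses.
# Entries are our payoff (the opponent's loss). Values are invented.
payoff = [
    [ 3, -1],   # hypothetical strategy: "patrol route A"
    [-2,  2],   # hypothetical strategy: "patrol route B"
]

# Assume the opponent answers each of our choices with the column that
# hurts us most, then pick the row whose worst case is best (maximin).
worst_cases = [min(row) for row in payoff]
best_row = max(range(len(payoff)), key=lambda r: worst_cases[r])

print(worst_cases)  # the guaranteed floor of each strategy
print(best_row)     # route A: its worst case (-1) beats route B's (-2)
```

The maximin choice is exactly the "security level" reasoning the post describes: plan against the cleverest reply, not the likeliest one.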

Cybernetic Vision – Blurring of Man & Machine boundary:

According to Galison, the system of weaponry and people that Wiener had in mind was predicated on a picture of a particular kind of enemy. On the mechanized battlefield, the enemy was neither invisible nor irrational; this was an enemy at home in the world of strategy, tactics, and manoeuvre, all the while thoroughly inaccessible to us, separated by a gulf of distance, speed, and metal. It was a vision in which the enemy pilot was so merged with machinery that (his) human-nonhuman status was blurred.

In fighting this cybernetic enemy, Wiener and his team began to conceive of the Allied antiaircraft operators as resembling the foe, and it was a short step from this elision of the human and the nonhuman in the ally to a blurring of the human-machine boundary in general. The servo mechanical enemy became, in the cybernetic vision of the 1940s, the prototype for human physiology and, ultimately, for all of human nature. Where Darwin had assiduously tracked the similarities between human and animal in order to blur the boundary between them, Wiener's efforts were devoted to effacing the distinction between human and machine.

World War II elevated the stakes of understanding the enemy's intention to survival itself; it stripped human behavior to moves of pursuit, escape, and deception; and it introduced a new class of self-regulating weapons. It is in this specific context that the identity of intention and self-correction was forged. And hence, the mechanized soldier faced his opponent as a machine, and machines manifested themselves as people.

Conclusion:
In this way cybernetics, which Wiener had intended as a means of using electrical networks to determine, several seconds in advance, where an attacking plane would be and to direct artillery fire with that knowledge, changed the entire nature of war. Cybernetics, that science-as-steersman, made an angel of control and a devil of disorder.
                                                                                                                                   Submitted by: Syed Ashruf,
                                                                                                                                                               AE09B025.


                                                                                                                                                

Thursday, March 10, 2011

"What was the vision for science and War in the 1930's. Have things changed since then?"

The increasing role of technology in warfare in the modern era has brought science and war into an increasingly intimate relationship. The relationship between science and war goes back ages, but it deepened during the 1930s, when science and war entered a "developmental symbiosis", each helping the other. Research in science was encouraged for the sake of war, and science played a critical role in war. I am going to discuss the "vision" of science and war in the 1930s and how it has changed since.
The military funding of science has had a powerful transformative effect on the practice and products of scientific research since the early 20th century. Particularly since World War I, advanced science-based technologies have been viewed as essential elements of a successful military.
World War I is often called "the chemists' war", both for the extensive use of poison gas and for the importance of nitrates and advanced high explosives. Poison gas, beginning in 1915 with chlorine from the powerful German dye industry, was used extensively by both the Germans and the British; over the course of the war, scientists on both sides raced to develop more and more potent chemicals and to devise countermeasures against the newest enemy gases.

[Image: human bomber]
 
The vision of science for war was not really cybernetic during World War I, apart from a few developments in wireless communication and sound-based methods of detecting U-boats.
But development after WWI became cybernetic, as militaries started investing in scientific research for their own enhancement. Enemies came to be viewed as cybernetic entities, as man-machines.
On the Allied side, the three closely related sciences engaged in calculating the enemy were operations research, game theory, and cybernetics. Operations research focused on maximizing efficiency in locating and destroying German U-boats in the North Atlantic and along the coast of America. Game theory offered a way of analyzing what two opposing forces ought to do when each expected the other to act in a maximally rational way but was ignorant both of the opponent's specific intentions and of where the enemy would choose to bluff. The need for national security led to the development of scientific and technological institutions. With the use of machines in war, the military came to see humans and machines as equivalent.
After the atomic bombs were dropped, Wiener felt guilty for having worked for the war and stated that his work had been used in ways over which he had no control.
Now science plays a crucial role in war and the military.
It is now the number of bombers that decides the power of a country, rather than its people.


In the 1930s the main motive for war was to gain power and resources. This is what defined the enemy in those years, and hence the enemy was narrowed down to a military entity. Nowadays, to stay secure, almost every country has its own arsenal of arms and nuclear weapons to make sure no one attacks it. Interest in winning the war, helped by economic support, permanently linked the military and science in a symbiosis which still exists.
The view of war has changed from a mode of settling issues to a race of destruction. An understanding of the value of human life has all but disappeared among the scientists developing these technologies.
After the Second World War, the advent of the Cold War cemented the links between military and academic science. From the 1930s onward the bond between the sciences and the military has kept growing, now with new ideas of "destruction", and in the future this cybernetic warfare may be waged by automatic machines without human involvement.

B.Vamshi
EE09B104


References:
Vannevar Bush, "As We May Think"

Wednesday, March 9, 2011

Vision of science and war in 1930s


Vision of science and war in 1930s

There are many who speak as though science were responsible for war, as though in the absence of scientific knowledge war would not have been waged so disastrously.
Nothing could be farther from the truth. Men fought fiercely with fists and cudgels, swords and axes, before they had mastered the techniques of science.


Cybernetics

Cybernetics is the interdisciplinary study of the structure of regulatory systems.
Cybernetics is closely related to control theory and systems theory. 
Both in its origins and in its evolution in the second half of the 20th century, cybernetics is equally applicable to physical and social (that is, language-based) systems.
Cybernetics is an example of a branch of science literally born out of war.
It had many fundamentally revolutionary ideas at its root.
The main concept behind it was that under extreme conditions, such as very high altitude, high mental tension, and high-speed aircraft manipulation, the human mind behaves mechanically rather than fully processing the situation before taking a decision.
This understanding paved the way to modeling the human mind mathematically (at least in extreme situations). Extensive research followed, and the result was high-precision anti-aircraft guns!
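The core of such an anti-aircraft predictor can be sketched as a dead-reckoning lead calculation. This linear extrapolation only hints at the idea; Wiener's actual predictor used statistical filtering of noisy track data, which is not shown here:

```python
def predicted_position(pos, velocity, shell_flight_time):
    """Aim where the target WILL be, assuming it holds course and
    speed for the shell's time of flight. A sketch of the lead
    computation behind wartime AA predictors, not Wiener's filter."""
    x, y = pos
    vx, vy = velocity
    return (x + vx * shell_flight_time, y + vy * shell_flight_time)

# Aircraft at (1000 m, 3000 m) flying 100 m/s east; shell takes 5 s.
print(predicted_position((1000, 3000), (100, 0), 5))  # aim at (1500, 3000)
```

The hard part, and Wiener's contribution, was that real pilots jink and real radar is noisy, so the extrapolation had to be statistical rather than this simple straight line.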

War is unavoidable because of the aggressive energy, the pugnacious spirit, of man. What science has done is change the nature and the front of warfare. By inventing the aircraft, science introduced a new factor into modern war.
Explosives are now rained down from above. 

A new dimension has been added to modern warfare with the introduction of missiles. The last Great War was brought to an end by two atom bombs that practically wiped out of existence two Japanese cities, Hiroshima and Nagasaki.


Conclusion:
Many wrongly believe that scientific progress in peacetime must be less vigorous than in wartime. According to them this is bound to be so because in normal times there is not the same sense of urgency as when a country is at war and things are on a war footing; the tempo of scientific research quickens in wartime because war multiplies profits. But war is never a permanent solution to any major problem of man. War or no war, science will continue its phenomenal march out of the inner urge of the scientist.

Gokul 
CH09B065

Monday, March 7, 2011

Vision of Science & War in 1930's


In the 1930s the relationship between science and war was quite different from what it is today. Since then, science and technology have made a substantial impact on the conduct of wars, and in the period around the 1930s major advances in the technology of warfare were seen.
Every nation has been involved in war at some time or other. As time passed these nations became more powerful and advanced; these days all nations have sophisticated weapons and advanced defence mechanisms.
Wiener coined the term cybernetics in 1947 to designate what he hoped would be a new science of control mechanisms in which the exchange of information would play a central role. The vision of science for war in this era was cybernetic. With the help of feedback loops, efficient control and communication were achieved, which was a key point in warfare. Battlefields were mechanized, and enemies were viewed as cybernetic entities, as man-machines.
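The feedback-loop idea mentioned above can be sketched as a minimal negative-feedback regulator; the gain and step count are arbitrary illustrative values:

```python
def regulate(setpoint, value, gain=0.5, steps=20):
    """Minimal negative-feedback loop: each cycle the error is measured
    and a correction proportional to it is fed back. Illustration only;
    real control systems add dynamics, noise, and tuning."""
    for _ in range(steps):
        error = setpoint - value
        value += gain * error   # feedback: correction opposes the error
    return value

# The regulated value converges on the setpoint.
print(regulate(100.0, 0.0))
```

This closing of the loop between observation and correction is exactly the control-through-information pattern Wiener generalized from gunnery to a science of its own.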
At the time of WWII, on the proposal of Vannevar Bush, President Roosevelt formed a new organization, the National Defense Research Committee (NDRC), which brought together government, military, business, and scientific leaders to coordinate military research, with Bush as its chairman. This committee led research into new weapons; Bush's organization oversaw the early Manhattan Project, which developed the first atomic bomb and helped secure the Allied victory. Bush's work not only helped the Allies win the war, but changed the way scientific research was done in the U.S. He demonstrated that technology was the key to winning a war, and in turn earned a new respect for scientists. Post-war, Bush argued that the nation would still need permanent support for research. In his reply to President Roosevelt's request he said:
“It is my judgment that the national interest in scientific research and scientific education can best be promoted by the creation of a National Research Foundation.”
The three closely related sciences engaged in calculating the enemy were operational research, game theory, and cybernetics. Wiener divided the enemies into two categories and regarded them as devils. One was the "Manichean devil", "who is determined on victory and will use any trick of craftiness or dissimulation to obtain this victory." The other, the "Augustinian devil", was characterized by the "evil" of chance and disorder but could not change the rules, unlike the "Manichean devil".
In the First World War, science had certainly played a huge part in chemical synthesis for explosives, poison gas, aeronautics and much more. In World War Two the scientific community was thoroughly mobilized to serve the state for military ends, and this led to the continuing close connection between science and the state in the following decades. As time passed, with the advancement of technology, new graduates were allocated to all the important areas of defense research. By the twentieth century science had become the language of war. Mutual interest in winning the war, reinforced by financial support, permanently linked the military and science in a web of cross-fertilization that continues today.

R.SURENDER NAIK
CH09B071
