Aviation Sample Paper
The central part of the portfolio is the professional essay. It should be 15 or more pages long and represent the candidate’s best writing. The essay should NOT adopt an autobiographical, casual, or informal tone. Please use a FORMAL, ACADEMIC TONE. It should:
1. Offer an overview of the Program of Study and identify the field of study the candidate has chosen to pursue.
2. Explain why those choices were made.
3. Explain how the candidate’s thinking about the discipline has changed with professional growth.
4. Identify the leading issues confronting particularly the major field today, the “big questions” that the field is debating, and the kinds of research and evidence gathering that dominate current research.
5. Provide a concise discussion of the candidate’s long-term ambitions as a scholar.
6. Identify what is lacking or inadequately understood in the field now.
7. What interpretations need more evidence or more questioning?
8. What kind of “trajectory” (or long-term research strategy) does the candidate envision for his or her work, at the dissertation stage and beyond?
9. What research questions seem most important and significant to pursue in the intended field of research?
10. The essay should make a case for the value of the fields of study being pursued.
11. Identify the significance of the work.
12. Identify what non-specialists and the public expect to learn from the field, and why that is useful.
Furthermore, the candidate should indicate where he or she stands on those questions and venture appraisals or criticisms of key contemporary works in the field. Overall, the candidate should demonstrate a firm understanding of the most influential works and interpretations in the field.
The coursework that constitutes my Program of Study is an exploration of the organizational theory of safety at aviation testing manufacturers, built on a solid research background. My Master’s program at Embry-Riddle Aeronautical University (ERAU) laid the basis for that research background with MAS 605 Research Methods and Statistics, which established a foundation in diverse research methods and statistical analyses. Additionally, the Graduate Research Project I completed at ERAU applied a quantitative research approach to the hypothesis I had proposed. I expanded the research foundation begun in my Master’s program starting with ASCI 547 Qualitative Research. Prior to this class, I had never considered using qualitative research to provide the data for my dissertation, but that methodology is now under consideration. I followed with ASCI 546 Quantitative Research to continue developing my research breadth and depth, and I completed the required research portion of my coursework with BSH 601 Research Methods in Behavioral Science, which allowed me to create and observe still more practical applications of both qualitative and quantitative research. ASCI 521 Organization Theory in Aviation was chosen as the first of several organizational courses to expand my knowledge of how aviation-based firms were originally organized and how they evolved into their current state. ORLD 565 Future Focused Leadership was also completed to raise my awareness of organizational leadership and of new leadership methods that can help improve an organization. The final organizational course taken was ORLD 545 Leading Organizational Change, which furthered my understanding of current organizational structures. In addition, to build upon my aviation safety career experience, I attended ASCI 522 Aviation Safety Programs, where I deepened my understanding of aviation safety programs throughout the aviation industry.
Finally, I completed ASCI 598 Graduate Reading, where I explored the latest organizational and leadership theories through peer-reviewed journal articles.
The researcher has worked in the aviation industry for over 20 years. The first nine years were spent working on software configuration at a large avionics design and manufacturing firm. The last 12 years have been spent at one of the large aerospace design and manufacturing firms, performing design system safety on tactical fighter jets, accident investigation of those jets, and test safety oversight of tactical fighter jets and unmanned aerial vehicles. The safety of the aviation industry was paramount when I began my career in software configuration. The sole purpose of new designs and upgrades of existing avionics was to create a safer aviation system. This applied to everything from Traffic Collision Avoidance Systems to weather radars aboard commercial and military aircraft to redesigning the air traffic management system into something more user-friendly and efficient. Performing design system safety, test safety, and accident investigation at the aerospace manufacturer likewise showed safety embedded in employees and managers alike, across all of its facets: personnel safety during the manufacturing process, the safety of the pilot during a test flight, and the collection of lessons learned from a mishap to prevent future occurrences. The author’s personal observation, reinforced over a 20-year career, is that a unified approach to safety is missing. There are low-level leaders of the different safety disciplines, but no single top leader to unify the approach and coordinate lessons learned where the disciplines overlap. Through years of observation and through reading peer-reviewed journal articles, the researcher has come to believe that a non-siloed safety approach creates a better safety culture. A unified safety organization can share lessons learned and communicate issues that may affect multiple safety disciplines.
The recent Safety Management Systems (SMS) requirements from the Federal Aviation Administration (FAA) and the International Civil Aviation Organization (ICAO) illustrate the need for a unified safety organization in aviation-related industries, from the manufacturer to the operator. SMS can be described as the means by which organizations manage safety risks systematically while maintaining and improving safety performance. According to the FAA, SMS is a formal, business-like, top-down approach to managing safety risk; it encompasses systematic procedures, policies, and practices related to safety management (FAA, 2010, p. 8). SMS is made up of four core components: Safety Policy, Safety Risk Management, Safety Assurance, and Safety Promotion. Safety Policy comprises management commitment, safety accountabilities, appointment of key safety personnel, coordination of emergency response, and SMS documentation. Safety Risk Management addresses hazard identification, risk analysis, and risk control and mitigation. Safety Assurance addresses matters related to the measurement and monitoring of safety performance. Safety Promotion is founded on education, training, and communication. Safety management systems are designed to address safety issues using both top-down and bottom-up approaches.
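For illustration only, the four SMS components and the sub-elements described above can be sketched as a simple data structure. This is a hypothetical sketch for organizing the discussion, not an official FAA or ICAO schema, and the `locate` helper is the author's own illustrative device:

```python
# Illustrative sketch (not an official FAA/ICAO schema): the four SMS
# components and the sub-elements discussed above, as a Python mapping.
SMS_COMPONENTS = {
    "Safety Policy": [
        "management commitment",
        "safety accountabilities",
        "appointment of key safety personnel",
        "coordination of emergency response",
        "SMS documentation",
    ],
    "Safety Risk Management": [
        "hazard identification",
        "risk analysis",
        "risk control and mitigation",
    ],
    "Safety Assurance": [
        "safety performance measurement and monitoring",
    ],
    "Safety Promotion": [
        "education and training",
        "safety communication",
    ],
}

def locate(element):
    """Return the SMS component that owns a given sub-element, if any."""
    for component, elements in SMS_COMPONENTS.items():
        if element in elements:
            return component
    return None

print(locate("hazard identification"))  # Safety Risk Management
```

A structure like this makes the point of the surrounding discussion concrete: every safety activity in the organization has a single, identifiable home within a unified framework.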
After completing the Saint Louis University Ph.D. program, the researcher intends to stay engaged in following the organizational trends and leadership structures that demonstrate the most continuous improvement in aviation safety organizations (Bass, 1985). This includes reductions in near misses, injuries, and accidents, and ultimately a safer culture. The ultimate personal goal for this researcher is to continuously review and collect data and to influence his current aviation safety organizational structure, evolving it into the safest possible arrangement by continually reducing not only the personal injury rate of the employees designing, testing, and building the aircraft, but also the accident and incident rate of the fielded customer platforms.
Aircraft accidents, personnel injuries, and minor incidents continue to happen even though the industry is more than 100 years old. From hull losses of commercial air carriers to the recent losses of a Gulfstream test aircraft and an Airbus A400M aircraft, the industry has demonstrated that it continues to make grave mistakes.
Reason’s model of accident causation includes four categories of causal factors: Absent/Failed Defenses, Active Failures and Unsafe Conditions, Latent Unsafe Conditions, and Organizational Factors and Decisions. The researcher would like to focus on Organizational Factors and Decisions. Not only is it one of the links in the chain whose removal may prevent a mishap, it is the one pillar that can positively influence the other three.
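The upstream position of organizational factors can be sketched, purely for illustration, as a causal chain filtered for organizational findings. The category names follow the essay's statement of Reason's model; the mishap data and the `organizational_findings` helper are hypothetical inventions of the author, not drawn from any actual investigation:

```python
# Hedged illustration of the four causal-factor categories named above.
# The chain data below are hypothetical, not from a real mishap report.
REASON_CATEGORIES = [
    "Organizational Factors and Decisions",  # upstream, latent
    "Latent Unsafe Conditions",
    "Active Failures and Unsafe Conditions",
    "Absent/Failed Defenses",                # last line of defense
]

# A hypothetical mishap expressed as (category, finding) pairs.
mishap_chain = [
    ("Organizational Factors and Decisions", "no unified safety leadership"),
    ("Latent Unsafe Conditions", "outdated test procedure"),
    ("Active Failures and Unsafe Conditions", "checklist step skipped"),
    ("Absent/Failed Defenses", "warning system inhibited"),
]

def organizational_findings(chain):
    """Pull out the upstream organizational findings that an investigator
    could compare across otherwise unrelated mishaps."""
    return [finding for category, finding in chain
            if category == "Organizational Factors and Decisions"]
```

The point of the sketch is that organizational findings, unlike the downstream active failures, can be aggregated across mishaps with unrelated proximate causes.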
The organization can learn from each mishap or near-miss investigation and then examine the organization as a whole. Reviewing the whole of the organization may uncover underlying issues that apply even to mishaps with unrelated causes. However, mishaps continue to occur at a steady rate in the commercial carrier world, suggesting that this kind of organizational learning remains incomplete.
In the researcher’s view, the authorities tasked with overseeing the aviation industry carry the greatest mandate for keeping the industry in check. For instance, the NTSB (2015) holds that aviation safety can be enhanced only if staff adopt and thrive in a climate of cooperation and honesty, in which aviation professionals can admit to mistakes without fear of being reprimanded. If professionals can come forward and admit to bad judgment and mistakes, the industry becomes safer owing to the enabling environment such a just culture creates. Unlike a blame culture, a just culture helps investigators find the cause of an accident and devise effective ways of addressing it (Jeffcot, Pidgeon, Weyman, & Walls, 2006). A culture in which aviation professionals are blamed and severely criticized for inadvertent errors discourages personnel from admitting and filing errors, thereby shrinking the available pool of safety information (Hales & Pronovost, 2006).
In the post-accident forensics phase in aviation, it is generally accepted that accidents do not happen as isolated cases. Rather, most result from a chain of events that culminates in a sequence of unsafe acts by the aircrew and other ground-based aviation professionals. During and after such forensics, aviation professionals ought to own up to their mistakes: in the aviation industry, human error reporting and the admission of previous mistakes is a leading source of safety information (Goh & Wiegmann, 2002).
In the aviation sector, the collection of operational data provides first-hand, real-time information that contributes to aviation safety (Barling, Kelloway, & Louglin, 2002). Despite being one of the leading preventers of accidents, the human factor is paradoxically also the leading source of mishaps. Modern studies in aviation safety place the human factor in the critical category of variables that contribute most profoundly to aviation accidents (Goh & Wiegmann, 2002; Shimomura & Kimita, 2012). Safety practitioners have broadened the description of human error to include concepts such as unsafe supervision and organizational influences (including operational processes, resource management, and organizational climate). In the maritime sector, for instance, almost 80 percent of commercial and recreational accidents are traced back to human error despite the regulatory, quality control, and educational initiatives in place (Shimomura & Kimita, 2012).
In trying to curb this dangerous trend, stakeholders in commercial aviation have devised various ways of encouraging and managing reports from aviation professionals. In light of the many accidents caused by professionals, the industry has come up with what is now referred to as the Confidential Incident Reporting System. The system bears information on past accidents and incidents in the aviation industry, analyzes them, and gives professionals an in-depth view of aviation accident and incident data, including ASRS data. The system is explicit in identifying recurring human error patterns and how each error can be managed proactively. Such systems have become “smarter” and are more heavily attuned to global economic demands in the presence of the ever-present risks posed by a wide range of human operators (Shimomura & Kimita, 2012).
Humans are by nature error-prone and, from time to time, fall behind in training and consequently in the safe operation of new technology. This in itself makes them a liability in operating the intricate technologies used in aviation. Many systems in the industry have been transformed from the paper-based documentation that characterized older forms of communication into integrated, real-time digital systems. Technological change, combined with greatly amplified flight-operations volumes, has intensified the risk of aviation accidents. As incidents and mishaps continue to occur, feedback from the related reporting systems is quickly becoming a vital early-warning tool for the decision makers and administrators tasked with improving safety regulations in the face of a doubled, sometimes quadrupled, scope of operations (Moseman, 2011).
As noted earlier, some of the most famous incidents and accidents involving aircraft have been caused by human error, including pilot error, miscommunication, and technical failures. After aviation accidents and related disasters, there is always an investigation, which may lead to prosecutions. In numerous cases, the professionals involved deny their mistakes at the outset, leading to protracted litigation.
Pilot error is one of the most common mistakes leading to aviation accidents. For instance, this factor was at the heart of the Eastern Air Lines Flight 401 crash near Miami, Florida (Chou, Madhavan & Funk, 1996). On December 29, 1972, the flight captain, first officer, and flight engineer were trying to sort out a problem with a faulty landing gear system and did not realize that the plane’s flight controls had been bumped: a crew member inadvertently altered the autopilot settings, taking the plane from level flight into a slow descent. A similar incident took place in Waynesboro, Virginia, on a plane operated by Piedmont Airlines, where the pilot suffered a mental breakdown and was incapacitated, and an inexperienced crew took charge and erred at the autopilot settings. The crew member was confused by instructions from ground support and ended up crashing the plane. The ground support crew was sued but did not admit to its mistakes. Such a practice may be blamed for the recurrence of similar mistakes that have led to further loss of life.
In a dissimilar but extraordinary example, the flight crew of Atlantic Airlines flight 234 was instructed by ground support to put their plane down in a sparsely populated area due to difficulty with its landing gear system. The crew had to work this out with very limited lighting; they lost their course, did not realize they were descending too fast, and eventually crashed in the Everglades. Human error in decision making was blamed for the horrible crash, leading to the prosecution of the ground support staff. The National Transportation Safety Board’s (NTSB) report on the accident blamed the flight crew for inadequately applying their extensive in-flight troubleshooting information and for failing to monitor the aircraft’s instruments properly. Details of the mishap have since become an important reference point and case-study material for training exercises for new aircrews and other members of air traffic control.
Establishing a just culture in the aviation industry provides the best prospect for reducing accidents and incidents. The present scenario is very different because it allows room for the perpetuation of a blame culture. Aviation professionals have in many cases ended up taking the blame for accidents and incidents attributable to collective mistakes by the larger group (Hofmann & Morgeson, 1999). At the same time, the fear of prosecution hinders aviation professionals from admitting mistakes, which in turn limits the collection of safety information (Johnson, 2011).
In 2004, the NTSB reported that pilot error was among the top primary causes of aviation accidents (NTSB, 2004). Based on those statistics, 78.6% of all aviation accidents resulted from some form of error on the part of the pilot (NTSB, 2004). This type of error also accounted for 75.5% of all general aviation accidents reported in the US (NTSB, 2004). Incidentally, in only about 23% of the accidents directly caused by pilots and other aviation professionals did those involved admit and own up to their mistakes (NTSB, 2004). Even in scheduled air transport mishaps and accidents, human or crew errors still account for over half of all accidents, yet criminalization has resulted in only a few convictions (Hokstad, Vatn, Aven & Sorum, 2004; Ciavarelli, 2007). Moreover, these cases seldom help the aviation sector in the long run because of the non-cooperation culture they foster (Hokstad, Vatn, Aven & Sorum, 2004).
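The scale of the reporting gap implied by these rates can be illustrated with a back-of-the-envelope calculation. The pool of 1,000 accidents below is purely hypothetical; only the percentage rates come from the NTSB (2004) figures cited above:

```python
# Back-of-the-envelope illustration of the NTSB (2004) rates cited above,
# applied to a purely hypothetical pool of 1,000 reported accidents.
TOTAL_ACCIDENTS = 1000      # hypothetical sample size, not NTSB data
PILOT_ERROR_RATE = 0.786    # 78.6% of accidents involved pilot error
ADMISSION_RATE = 0.23       # ~23% of those at fault admitted the error

pilot_error_accidents = round(TOTAL_ACCIDENTS * PILOT_ERROR_RATE)
admitted = round(pilot_error_accidents * ADMISSION_RATE)
silent = pilot_error_accidents - admitted

# Of 786 pilot-error accidents, roughly 181 would involve an admission,
# leaving about 605 with no firsthand account of what went wrong.
print(pilot_error_accidents, admitted, silent)
```

Under these assumed figures, the safety information pool would lose firsthand accounts from roughly three out of every four pilot-error accidents, which is precisely the loss a just culture aims to recover.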
Singapore Airlines flight 006 in 2000 was another classic example of the miscommunication the aviation sector grapples with from time to time (Henderson, 2003). It is not a common phenomenon for a plane to collide with ground equipment, because common practice demands that runways be clear during landing and takeoff. However, the Singapore Airlines aircraft ran into a bulldozer. The plane taxied to its takeoff point during stormy weather under the direction of poorly informed ground crew. In the middle of the storm, the pilot and ground crew grappled with low visibility, and the Boeing 747 was steered onto the wrong runway, which was cluttered with concrete barriers, excavators, and a bulldozer; it had been closed for repairs. The pilot did not see the equipment until the plane was too close to stop, and it rammed into the bulldozer. The inquiry into the accident indicated that the pilot had failed to read a report on the repairs issued months earlier, which clearly stated that the runway was due for closure for renovation and that no planes would use it until further notice. Evidently, this was a case of ignorance on the part of the pilot for not acting on information that had already been made public; it was in the absence of this information that the pilot began takeoff procedures on the wrong runway. Similarly, the ground crew lacked critical information regarding the runway, possibly because they too had not read the report. The accident claimed the lives of 83 of the 179 people on board, including four crew members. After the incident, the ground crew and the other involved parties were charged with recklessness, but most were found not guilty due to insufficient evidence. Again, none of the crewmembers accepted responsibility in light of the ongoing prosecutions, further highlighting the need for cooperation and for a just culture in aviation.
Another major incident that led to criminal prosecutions was the 1978 PSA Flight 182 crash in California, in which a Cessna plane collided with a larger Boeing 727’s wing two minutes after transmitting communication to the ground support crew. While on a routine trip from Sacramento to San Diego, the Pacific Southwest Airlines flight encountered a private Cessna aircraft whose pilot was giving flying lessons to a student. The two planes avoided each other the first time, but miscommunication with air traffic control, as well as panic and nervousness on the part of the Boeing 727 crew, led to the accident. A communication mix-up led the controllers to make the wrong assumption that no further information or clarification regarding the Cessna’s location was needed. The Pacific Southwest Airlines aircraft’s engine malfunctioned in midair after it was hit and destroyed by the smaller private aircraft. The gruesome accident, which claimed 135 people aboard the airliner and 7 on the ground, triggered a major public outcry.
Such accidents have long increased pressure on authorities to enhance safety and to criminalize aviation accidents whose liability is directly attributable to negligence, recklessness, gross misconduct, and unprofessional behavior (Grawitch & Ballard, 2016). In the case of the Pacific Southwest Airlines crash, authorities found that a communication breakdown between the ground crew and the flight crew directly contributed to the accident. The investigation also led to the prosecution of ground crew members accused of sheer negligence and disregard for the aviation error-management systems in place; for instance, they could have notified the cabin crew of the danger in due time. Undoubtedly, clear and constant communication by air traffic control could have averted the disaster. In the aftermath, none of the air traffic control members accepted responsibility, and the prosecutions were based only on forensics rather than on testimony, which would have offered more accurate information about the systems and constraints of air traffic control. At the time, however, the aviation industry had no such outlets, and the team ended up withholding information that was vital for aviation safety.
Today, post-crash litigation is still more or less the norm, and there are only a few cases where the process has been improved by a climate of cooperation and honesty in which aviation professionals confess their poor decisions without fear of unneeded reprimand (Amouzegar, 2002). Thus, today’s aviation safety system is characterized by the absence of an enabling environment for aviation safety specialists to gather and synthesize information in a centralized manner from the parties’ firsthand accounts of what really happened, with the aim of preventing recurrence. The silo-based safety culture remains an integral reference point for the operations of aviation professionals. It is this type of culture that normally sees them blamed and held accountable for inadvertent errors; in turn, this discourages front-line operators from reporting most kinds of mistakes (Cavanagh et al., 2010). Consequently, similar accidents keep recurring, exemplifying unsuccessful attempts by stakeholders in the aviation industry to reduce accidents and incidents (NTSB, 2015). For safety information to be shared freely and aviation safety reinforced, such a dangerous organizational culture must be eliminated.
Based on this discussion, the researcher proposes the following research questions to act as a basis for a more in-depth study on the topic:
- How can the aviation industry establish a just culture as one of the most important solutions towards the reduction of accidents and incidents that are crippling the aviation industry?
- Does the current setup of the aviation industry perpetuate the emergence and growth of a blame culture that has led to serious safety-related challenges?
- What are some of the reasons that have led to aviation professionals taking the blame for accidents and incidents attributed to collective mistakes by the larger group in the industry?
- Does the fear of prosecution hinder aviation professionals from admitting mistakes and hence limit the collection of safety information?
Aviation accidents often have catastrophic consequences in terms of finances and human lives. Owing to this impact, the industry invests more than 25% of total revenue in safety (ATSB, 2015). Nonetheless, the industry has not done much to foster a just culture among crewmembers (Geller, 2000). Without changing the organizational framework within which safety issues are addressed, increased spending on aviation safety may not bring about the desired change. This view is reflected in the appreciation of a just culture as the foundation upon which tangible measures are implemented to improve safety based on the information collected (Geller, 2000). The existence of this underlying problem is reflected in the aviation disasters that continue to occur despite the introduction of state-of-the-art technology in the industry (Wiggins & O’Hare, 2003). For example, in the aftermath of the disappearance of Malaysia Airlines Flight 370 on March 8, 2014, video footage indicated that some passengers had boarded the plane without going through the necessary security checks. Such negligence highlights how careless staff can become on a normal day, and it has little or nothing to do with the safety technologies deployed or the amount of information accessible to policymakers in the industry (ATSB, 2015).
Adopting a just culture in which a delicate balance between accountability and safety is maintained is a training-intensive affair that is heavily dependent on the personnel involved. Unfortunately, some professionals never bother to adopt it; they never admit to making mistakes, no matter how much accountability-related training and induction they go through (Easterby-Smith, Thorpe, & Lowe, 2002). This means that a just culture can be effective only if it begins at the recruitment level. All potential employees should go through a rigorous interview process aimed at determining their ability to conform to a just culture (Glista, 2003). After that, aviation firms must monitor their employees continuously and adopt corrective measures whenever they detect negligent behavior among members of the crew.
From another perspective, a major enabler of the current blame culture is the existence of numerous precedents in which aviation professionals were blamed for errors that were not of their own making. The tendency to blame a section of people or a department once an accident occurs not only demoralizes staff but also discourages information sharing and genuine, whole-hearted inquiry into the real causes of a disaster with the aim of drawing lessons to prevent recurrence. The blame game does little to help develop a just culture and inhibits efforts to strengthen safety management in the aviation industry.
At this point, it is imperative to put into context the three basic types of error that occur in the aviation industry: decision errors, skill-based errors, and perceptual errors. Personnel may make skill-based and decision errors, but regardless of the damage these errors may trigger, there is still an aspect of collective blame. For instance, recruiting, on-the-job training, poor briefing, faulty equipment, and poor communication may all lie behind a single skill-based or decision error. The researcher is of the view that an in-depth analysis of major disasters such as those highlighted in this essay would show that a chain of errors always precedes a catastrophic mishap. Therefore, stakeholders in the aviation industry must encourage a just culture in the assessment of accidents and incidents if a long-term solution to the safety problem is to be achieved (Moseman, 2011).
The aviation industry faces the pervasive challenge of continually improving the way safety management systems are reviewed. In this regard, the greatest threat is posed by personnel rather than by technological shortcomings. According to Moseman (2011), aviation safety can be enhanced only if staff adopt and thrive in a climate of cooperation and honesty in which owning up to mistakes does not automatically translate into litigation, prosecution, and reprimand. The best way to achieve this goal is through the adoption of a just culture among staff across the aviation industry. Such a culture ensures that everyone contributes meaningfully and wholeheartedly to preventing the recurrence of the mistakes that lead up to incidents. Its adoption makes it easier for staff members to confess and document the mistakes they make, so that the information can serve new pilots as a future reference for avoiding accidents. A culture in which aviation professionals are blamed and held accountable for inadvertent errors discourages operators from reporting their mistakes, with the consequence being a reduction in the amount of aviation safety information shared.
Amouzegar, M. (2002). Supporting Expeditionary Aerospace Forces: Alternatives for Jet Engine Intermediate Maintenance. Santa Monica, CA: RAND Corporation.
ATSB. (2015). Aviation safety – Aviation Publication – Evacuation Commands for Optimal Passenger Management. Web.
Barling, J., Kelloway, K. E. & Louglin, C. (2002). Development and test of a model linking safety-specific transformational leadership and occupational safety. Journal of Applied Psychology, 87(3), 488-496.
Bass, B. M. (1985). Leadership and performance beyond expectations. New York, NY: Free Press.
Cavanagh, J. F., Frank, M. J., & Allen, J. J. B. (2010). Social stress reactivity alters reward and punishment learning. Social Cognitive and Affective Neuroscience, 6(3), 311–320.
Ciavarelli, A. (2007). Safety Climate and Risk Culture: How Does Your Organization Measure Up? Web.
Easterby-Smith, M., Thorpe, R. & Lowe, A. (2002). Management Research: An Introduction (2nd Edition), SAGE Publications Ltd, London.
Geller, S. (2000). 10 Leadership Qualities for a Total Safety Culture. Professional Safety, 45(5), 38-41.
Glista, T. (2003). FAA/Industry Training Standards (FITS): Times (and training requirements are a changing). FAA Aviation News, 42(4), 1-6.
Goh, J. & Wiegmann, D. (2002). Human error analysis of accidents involving visual flight rules flight into adverse weather. Aviation, Space, and Environmental Medicine, 78(8), 817-22.
Grawitch, M. & Ballard, D. (2016). The psychologically healthy workplace: Building a win-win environment for organizations and employees. London: Routledge.
Hales, B. M., & Pronovost, P. J. (2006). The checklist — a tool for error management and performance improvement. Journal of Critical Care, 21, 231–235.
Hofmann, D. & Morgeson, F. (1999). Safety-related behavior as a social exchange: The role of perceived organizational support and leader-member exchange. Journal of Applied Psychology, 84, 286-296
Hokstad, P., Vatn, J., Aven, T. & Sorum, M. (2004). Use of risk acceptance criteria in Norwegian offshore industry: Dilemmas and challenges. Risk Decision and Policy. 9(3), 193-206.
Jeffcot, S., Pidgeon, N., Weyman, A, & Walls, J. (2006). Risk, Trust, and Safety Culture in U.K Train Operating Companies. Risk Analysis, 26 (5), 1105-1121.
Johnson, S. B. (2011). System health management: With aerospace applications. Hoboken, NJ: Wiley.
Moseman, J. (2011). New risk acceptance criteria for process safety. Process Safety Progress, 31(1), 6–8.
NTSB. (2015). Aviation Safety Studies & Special Report – Emergency Evacuation of Commercial Airplanes. Web.
Shimomura, Y., & Kimita, K. (2012). The philosopher’s stone for sustainability. Proceedings of the International Conference on Industrial Product-Service Systems. Tokyo: Springer, 539.
Wiggins, M., & O’Hare, D. (2003). Expert and novice pilot perceptions of static in-flight images of weather. The International Journal of Aviation Psychology, 13(2), 173-187.