2013

National BDPA Technology Conference

IT SHOWCASE PROCEEDINGS
Washington, DC | August 14-17, 2013
Washington Hilton
Edited by: Jesse L. Bemley, Ph.D.

© Black Data Processing Associates, August 14, 2013

CONTENTS
A LETTER FROM DR. JESSE L. BEMLEY
THE STORY OF THE IT SHOWCASE
2012 IT SHOWCASE WINNERS
MEET THE IT SHOWCASE JUDGES
IT SHOWCASE PRESENTATION SCHEDULE AT A GLANCE
    WEDNESDAY, AUGUST 14
    THURSDAY, AUGUST 15
KEYNOTE SPEAKER
HIGH SCHOOL PRESENTATIONS AT A GLANCE
MEET THE HIGH SCHOOL STUDENT PARTICIPANTS
MOTION SENSOR TECHNOLOGY IN VIDEO GAMING
FUZZY LOGIC
CYCLOTRONS: FROM THE LAB TO THE FIELD
THE FUTURE OF AIR TRAVEL
UNDERGRADUATE PRESENTATIONS AT A GLANCE
MEET THE UNDERGRADUATE PARTICIPANTS
NEURAL NETWORKS
SECURING BIG DATA
TERRAIN CLASSIFICATION
THE EXPANSION OF CYBER SECURITY
THE TRUTH ABOUT BIOARTIFICIAL ORGANS
MODERN WEARABLE TECHNOLOGIES
CRASH COURSE CLOUD COMPUTING

A LETTER FROM DR. JESSE L. BEMLEY
Welcome and thank you for participating in the 11th
Annual National BDPA IT Showcase. This will be an
exciting three days!
These proceedings contain papers from 11 students,
IT Showcase history, a list of past winners and the
photos of winners since 2008.
The Presentations/Projects were judged earlier this
morning in two categories, college (undergraduate)
and high school. Three awards will be made in each
category for 1st, 2nd, and 3rd place. Certificates to all
participants are to be presented at the end of the IT
Showcase session on Thursday afternoon. Awards
for first, second, and third place winners are
presented at the Awards Banquet on Saturday.
During the first decade of the IT Showcase, 2003-2012, there were 42 undergraduate papers and
66 high school papers presented. The students represented 14 states, 24 high schools, 12
universities, and two community colleges. Their papers covered database technology, web design
technology, wired and wireless communication technology, IT security, data mining, soft
computing, high performance computing, cloud computing, virtual technologies,
nanotechnology, robotics, operating systems, IT certification, social media, Big Data, health
information systems, etc.
Again, thanks for joining us at the BDPA IT Showcase!

Jesse L. Bemley, Ph.D.
BDPA IT Showcase
Conference Manager

THE STORY OF THE IT SHOWCASE
The idea that led to the creation of the IT Showcase has taken several twists and turns over the
years. As far back as the late 1980s, a UNISYS communications engineer in Philadelphia talked
about a BDPA computer science fair. The computer science fair would be patterned after the
traditional high school science fair.
The idea was put on the back burner because of the all-consuming activities of the pilot HSCC
which was held in Atlanta in 1986. During the New Orleans Convention in 1987, the Artificial
Intelligence Model of a Black Teenager was presented by three teenage girls. The model was an
expert system developed from the AI language Prolog. The student presentation was a
component of Dr. Jesse Bemley’s workshop.
Bemley’s National Conference workshops included high school students from 1989 – 1992. The
students’ names were not a part of the conference program. Instead the workshops had separate
programs as handouts.
In 1993 Margaret Jennings suggested that Bemley’s students participate in the Youth Conference
as the High School Computer Science Fair at the BDPA National Conference in Kansas City, MO.
For the very first time, students' names were published in the official conference Youth Activities
Guide. Five high school students presented papers. Their research areas included expert systems,
logic puzzles, neural networks, and fractals. The activity continued until the 1997 conference in
Houston, TX.
There were no further computer science fairs. The national conference co-coordinator did not
want students making presentations to Youth Conference participants, only adult presenters.
In 2001 Dwight Huggett, then HSCC coordinator, proposed an IT Showcase where the projects of
college and high school students would be highlighted. The effort did not get off the ground.
There was a subsequent attempt in 2002. Again, the resources were not available.
In 2003, BDPA President Elect Wayne Hicks asked Bemley to accept the challenge, which he did.
Hicks wanted an event that would keep the HSCC alumni moving academically toward the
Ph.D. Bemley modified the requirements for participation to include: a 10-page paper on a leading-edge
topic, a 15-minute PowerPoint presentation, a 3 ft by 4 ft poster which graphically depicts the
paper, a one-page bio with photo, and a trip report due a week after the conference. The 2003
BDPA National Conference in Philadelphia hosted the first IT Showcase. Washington, DC is
hosting the 2013 BDPA National Technology Conference and the 11th BDPA IT Showcase.
The Cincinnati Chapter hosted the first regional IT Showcase in 2005 and another in 2011. There
have been several unsuccessful attempts at regional/local IT Showcases in subsequent years. The
BDPA-DC Regional Technology Conference in Washington, DC held very successful IT Showcases
in 2007, 2008, 2009, 2010, 2011 and 2012. The Northern Delaware Regional IT Showcase was
held on May 15, 2010; it can be used as a model for other regional IT Showcases that don't wish
to use the traditional IT Showcase paradigm. The Greater Columbia Chapter hosted the Washington,
DC Chapter in 2012 at a very successful Southeast Regional IT Showcase.
2012 IT SHOWCASE WINNERS

First Place 2012 High School Winner
TYRA NATORI FOULKS
Irmo High School
Columbia, SC

Second Place 2012 High School Winner
WESLEY WALKER
Groveport Madison High School and
Fairfield Career Center
Groveport, OH

Third Place 2012 High School Winner
BRANDI TAYLOR
Irmo High School
Columbia, SC

First Place 2012 Undergraduate Winner
MICHAEL BIJOU
Bowie State University

Second Place 2012 Undergraduate Winner
BRYAN BEMLEY
Bowie State University

Third Place 2012 Undergraduate Winner
ADWAIT WALIMBE
University of Minnesota

MEET THE IT SHOWCASE JUDGES

Chief Judge
CURTIS ROBERTS
NOAA
National Satellite Data and Information Service

Mr. Curtis Roberts is the Enterprise Architect for the National Satellite Data and Information
Service, an office of the National Oceanic and Atmospheric Administration whose mission is the
acquisition and dissemination of environmental data from satellites and the development of data
products used worldwide. Mr. Roberts comes to NOAA from the Department of Housing and
Urban Development (HUD) in Washington, District of Columbia, where he served as a Senior
Information Technology Specialist and Project Manager with the Office of Systems Integration
and Efficiency. His responsibility there was to support the mission to increase homeownership,
support community development, and increase access to affordable housing.

Mr. Roberts, a 15-year US Navy veteran, entered the Federal Civilian Service with HUD in 1989.
Previously a Department of Defense civilian contractor, Mr. Roberts performed Information
Technology technical services supporting the Office of the Secretary of Defense, the Vice Chief of
the Army, and Aviation Engineering Logistical Support for the US Navy and Naval Air Systems
Command.

Understanding the need for continued education in Science, Technology, Engineering and Math,
Mr. Roberts holds an advisory position with the Hampton Roads Chapter of BDPA and maintains
advisory relationships with non-profit, community-based educational organizations that teach
STEM topics to students in grades 4 through 12.

Judge
FABIANNA SOLARI
Middleware Services Advanced Support Teams
Monsanto

Fabianna has over 14 years' experience in the IT industry. She began her IT career in IBM Global
Services, where she focused on developing and supporting ecommerce, Java, AS/400 and
Microsoft applications. In 2000, she began her career at Monsanto as a contractor on the Web
Engineering Team and transitioned to a Monsanto employee in 2001. During her career at
Monsanto, she has held various roles including Project Manager, Web Server Security and
Implementation Engineer, Windows Security Administrator, and Web, Database, and Portals
Advanced Support Lead. She currently leads the Middleware Services Advanced Support Teams,
which support emerging platforms such as Business Objects, Informatica, BPM (Lombardi), and
GeoSpatial.

In addition to receiving her B.S. and M.S. in Computer Science from Clark Atlanta University
with a concentration in Data Warehousing, Fabianna has maintained a focus on leadership and
team development and on training for youth, college students, and professionals. Over the past 8
years, she has focused on her passion for helping others leverage their talents and strengths to
achieve their goals. This has been accomplished via coaching and mentoring others as well as
facilitating sessions such as Strength Finders, DiSC, SOAR, iSpeak, Asset Based Thinking,
Transformational Conversations, and Effective Presentation and Communication Skills.

At Monsanto, Fabianna is a member of the African Americans in Monsanto (AAIM) Leadership
and Retention Committee, the GI Leadership Development Committee, the Monsanto BDPA
Committee, the Women's Network, Women in IT, Young Professionals, and the Women's
Leadership Giving Initiative. She also serves as one of two Monsanto liaisons for the St. Louis
Chapter of the National Black MBA Association. Fabianna enjoys spending time with her nieces,
nephews, godchildren and youth ministry teens. She also enjoys traveling to a new destination
each year, cruising, vacationing in the Caribbean, and trying new recipes.

Judge
DR. JAMES M. TURNER
NOAA
Director of the Office of International Affairs

Dr. James M. Turner leads NOAA's international scientific and environmental efforts associated
with the global oceans, atmosphere, and space. He serves as the principal advisor to the Under
Secretary and Administrator on international policy issues; represents NOAA and the United
States with foreign governments and in international fora; establishes policies, guidelines, and
procedures for NOAA's international programs; and provides support and coordination to
NOAA's line offices. These efforts help us to better understand, predict, and take steps to
respond to changes in the Earth's environment, conserve and manage coastal and marine
resources, protect life and property, and provide decision makers with reliable scientific
information.

Dr. Turner comes to NOAA from the U.S. Department of Commerce's National Institute of
Standards and Technology (NIST), where he served as the Acting Director (September 2007 to
September 2008) and Deputy Director (from April to September 2007). NIST promotes U.S.
innovation and industrial competitiveness by advancing measurement science, standards, and
technology.
Prior to joining NIST, Dr. Turner served as the Assistant Deputy Administrator for Nuclear Risk
Reduction in the Department of Energy’s National Nuclear Security Administration. In that
position he was responsible for major projects in Russia to permanently shut down their last
three weapons-grade plutonium-production reactors. He also worked with foreign governments
and international agencies to reduce the consequences of nuclear accidents by strengthening their
capability to respond to nuclear emergencies. Turner has also held several senior management
posts at DOE concerned with laboratory oversight and with nuclear safety and the safeguarding
of nuclear weapons both here and abroad.
He holds degrees in Physics from the Massachusetts Institute of Technology (Ph.D.) and Johns
Hopkins University (B.A.), and taught for five years as an Associate Professor of Physics and
Engineering at Morehouse College.

IT SHOWCASE PRESENTATION SCHEDULE AT A GLANCE
WEDNESDAY, AUGUST 14
Practice Session and Poster Setup
After Opening Session

THURSDAY, AUGUST 15
Poster Presentation Judging
8:30 a.m. - 10:00 a.m.
(Closed to Public)
Welcome and Introductions
Dr. Jesse Bemley, IT Showcase Manager
10:00 a.m. - 10:30 a.m.
Keynote Speaker Address
Shaneece Davis, IT Specialist
US Department of Health and Human Services
Morning Presentations
10:30 a.m. - 12:00 p.m.
Lunch
12:00 p.m. - 1:25 p.m.
Afternoon Presentations
1:30 p.m. - 3:00 p.m.
Award Presentation
3:30 p.m. - 4:30 p.m.
Closing Remarks
Monique Berry, National BDPA President

KEYNOTE SPEAKER
SHANEECE DAVIS
HHS
IT Specialist
The Centers for Medicare & Medicaid Services

Shaneece Davis is an IT Specialist at the Centers for Medicare and Medicaid Services with a
background in project management and a passion for Information Technology.

Shaneece most recently earned her Master of Science degree in Information Management from
the University of Maryland at College Park and also holds a Bachelor of Science degree in
Computer Information Systems from North Carolina Central University.

Shaneece participated in the BDPA Annual IT Showcase in 2010 and 2011 and received 2nd place
in the competition for her research titled "Increasing Active Learning among Students: NCCU's
Introduction of Virtual Computing Lab to Grades K-12."

Shaneece has a special interest in pursuing different research endeavors related to health care,
Information Science, and IT. She has been a student member of BDPA since 2010 and has most
recently joined the Golden Key Honour Society and the Special Libraries Association (SLA) as of
this year.

In her current position as an IT Specialist, Shaneece assists in the oversight of various
government contracts with duties related to project management. Being a recent graduate and
fairly new to the workforce, Shaneece hopes to make a huge impact on the IT field someday.

Connect with Shaneece
Email: shaneecedavis@gmail.com
Facebook: facebook.com/ShaneeceSDavis
LinkedIn: linkedin.com/pub/shaneece-davis/33/953/1b1

HIGH SCHOOL PRESENTATIONS AT A GLANCE
CYBER SECURITY AND THE TYPE OF HACKERS INVOLVED
Zinquarn Wright

NETWORK SECURITY ENGINEER: WHAT IS IT?
Anthony Lawson

CYBER SECURITY FOR PUBLIC WORKS AND UTILITIES
Cristal Sandoval

LAW AND FORENSIC SCIENCE: THE TECHNOLOGICAL ASPECTS
Abdull Ali

"PENETRATING THE CLOUD": EXPLORATION OF CLOUD COMPUTING
Brandi Taylor

MOTION SENSOR TECHNOLOGY IN VIDEO GAMING
Julian Anderson

FUZZY LOGIC
Tevon Eversley

CYCLOTRONS: FROM THE LAB TO THE FIELD
Ian Crowder

THE FUTURE OF AIR TRAVEL
Jared Sherrod

MEET THE HIGH SCHOOL STUDENT PARTICIPANTS

ZINQUARN WRIGHT
McKinley Technology High School
Washington, DC
Zinquarn Wright is a junior at McKinley Technology High
School. He will major in music production in college. His
presentation is “Cyber Security and The Type of Hackers
Involved”.

ANTHONY LAWSON
Academy for Ideal Education
Washington, DC
Anthony Lawson is a junior at Academy for Ideal
Education. He expects to major in architecture upon
matriculation at the university. His presentation is
“Network Security Engineer: What is it?”

CRISTAL SANDOVAL
Thurgood Marshall Public Charter School
Washington, DC
Cristal Sandoval is a junior at Thurgood Marshall
High School. She has a keen interest in Science and
Math. Her presentation is “Cyber Security for Public
Works and Utilities”.

ABDULL ALI
Frank Ballou High School
Washington, DC
Abdull Ali is a junior at Ballou High School. His career
aspirations include veterinary and IT fields. His
presentation is “Law And Forensic Science: The
Technological Aspects”.

BRANDI TAYLOR
Irmo High School
Columbia, SC
Brandi Taylor is currently a 10th grade student at
Irmo High School. Brandi's current career plan after
high school is to become a nurse. Her ultimate motto
is "I can do anything if I put my mind to it." Her
presentation is "'Penetrating the Cloud': Exploration of
Cloud Computing".

JULIAN ANDERSON
Oak Park River Forest High School
Oak Park, IL
Julian Anderson was born in Oak Park, Illinois on
September 30th, 1994. At the age of 2, he became the
youngest child to go to the Supreme Court. He is
fascinated by how technology impacts the lives of
everyone every day. He will be attending the
University of Nebraska in the fall. He doesn't have a
major yet; however, he hopes to start a career where
technology is heavily involved. His presentation is
"Motion Sensor Technology in Video Gaming".

TEVON EVERSLEY
Bedford Academy High School
Brooklyn, NY
Tevon Eversley is on track to receive an Advanced
Regents Diploma from Bedford Academy High
School. He would like to attend the Massachusetts
Institute of Technology to study mechanical
engineering as well as architecture. His presentation is
"Fuzzy Logic".

IAN CROWDER
Oak Park River Forest High School
Chicago, IL
Ian has pursued a galvanized interest in physics by
taking part in several scientific symposiums in which
he presented his research on particle, applied, and
atomic physics. Ian has recently graduated from Oak
Park River Forest High School and, this fall, he will be
attending Iowa State University. His presentation is
"Cyclotrons: From the Lab to the Field".

JARED SHERROD
Delaware Academy of Public Safety and Security
Newark, DE
Jared D. Sherrod is a fourteen year old freshman at
Delaware Academy of Public Safety and Security.
Jared is in his first year in high school where he
maintains a 3.0 grade point average and is trying to
achieve his goal to graduate as an Air Force Academy
Cadet. His presentation is “The Future of Air Travel”.

MOTION SENSOR TECHNOLOGY IN VIDEO GAMING
The Wave of The Future
JULIAN ANDERSON
Oak Park River Forest High School
Oak Park, IL
Motion sensor technology has been revolutionizing the way we live for years. It is not only
changing the video game industry, but will be a key component of the technology of the future.
Motion sensors have been used mainly in automotive and specialized industrial applications.
New applications and broader market opportunities for motion sensors have emerged since 2006
when the devices were first utilized in the Nintendo Wii console. The market for motion sensors
has now grown to include a much broader set of
applications including sporting equipment, home
appliances, industrial solutions, security monitoring
systems, medical rehabilitation and clinical devices.
Motion Sensing Technology is drastically changing the
video gaming industry.
The first motion sensor gaming technology was the Power Glove, originally released in 1989 by
Nintendo. The flex sensors in the Power Glove were carbon-based ink on plastic; bending the
flex sensors caused the carbon to compress, decreasing the resistance. The sensors in the
DataGlove were based on optical fibers that were scratched near the bending joint, causing them
to transmit less light when bent, an innovation developed by Young L. Harvill of VPL Research.
There were two ultrasonic speakers (transmitters) in the glove and three ultrasonic microphones
(receivers) around the TV monitor.

[Image: Power Glove]
The ultrasonic speakers take turns transmitting a short burst (a few pulses) of 40 kHz sound, and
the system measures the time it takes for the sound to reach the microphones. A triangulation
calculation is performed to determine the X, Y, Z location of each of the two speakers, which
specifies the yaw and roll of the hand. The only dimension it cannot calculate is the pitch of the
hand, since the hand can pitch without moving the location of the two ultrasonic speakers. The
Power Glove was based on the patented technology of the VPL Dataglove, but with many
modifications that allowed it to be used with slow hardware and sold at an affordable price. The
Dataglove can detect yaw, pitch, and roll; it uses fiber optic sensors to detect finger flexure and
has a resolution of 256 positions (8 bits) per finger for four fingers (the little finger is not
measured to save money, for it usually follows the movement of the ring finger). The Power
Glove could only detect roll, and used sensors coated with conductive ink, yielding a resolution
of four positions (2 bits) per finger for four fingers. This allowed the Power Glove to store all the
finger flexure information in a single byte. However, it appears that the fingers actually feed an
analog signal to the microprocessor on the Power Glove, which converts the analog signal into
two bits per finger.
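To make the triangulation step concrete, here is a minimal sketch of recovering one speaker's position from ultrasonic time-of-flight readings. It is not the Power Glove's actual firmware: the microphone positions, the sample timings, and the use of SciPy's least-squares solver are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s in room-temperature air

# Hypothetical microphone positions around the TV monitor, in meters.
MICS = np.array([[-0.3, 0.00, 0.0],
                 [ 0.3, 0.00, 0.0],
                 [ 0.0, 0.25, 0.0]])

def locate_speaker(times_s):
    """Estimate the (x, y, z) of one ultrasonic speaker from the time
    each 40 kHz burst takes to reach the three microphones."""
    ranges = SPEED_OF_SOUND * np.asarray(times_s)  # time of flight -> distance
    # Residual: modeled mic-to-speaker distances minus measured ones.
    def residual(p):
        return np.linalg.norm(MICS - p, axis=1) - ranges
    # All mics lie in one plane, so there is a front/back mirror ambiguity;
    # starting the solver in front of the TV (z = 1 m) picks the right side.
    return least_squares(residual, x0=[0.0, 0.0, 1.0]).x

# Hypothetical arrival times for one burst, in seconds.
print(locate_speaker([0.00322, 0.00295, 0.00301]))
```

Locating both speakers this way gives the hand's position, and the vector between the two speakers then yields yaw and roll (but, as the text notes, not pitch).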
The technology has been greatly advanced by Nintendo, Sony and Microsoft since the Power
Glove. The current technology in the Wii Remote’s built-in optical sensor acts like a camera used
to locate the sensor bar’s infrared LEDs in its point of view. By plotting where two spots of light
fall (one from each end of the sensor bar), the Wii Console is able to determine where the remote
is pointing in relation to the screen. It all sounds very clever, and it is, but it does have one
obvious drawback. A user, or indeed multiple users, would need to remain within the sensor
bar’s point of view in order for the system to work. In addition to knowing where a user is
pointing, the Wii Remote can also calculate how it’s being moved. This is done with the use of
accelerometers – tiny chips that feature a piece of silicon anchored at one end and allowed to
move at the other between an electric field created by a pair of capacitors. Accelerating the
remote in one direction causes the silicon to move and disrupt the electronic field. That change is
translated to motion, and the Wii Remote can measure acceleration on three axes – allowing for
the ability to perform a variety of gestures such as moving side to side, twisting, and pushing and
pulling. On top of all this, the data captured by the optical sensor and accelerometers needs to be
sent back to the Wii console without wires. In order to achieve that, the Wii remote contains a
built-in Bluetooth chip that allows for two-way communication with the console.
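As an illustration of how raw accelerometer readings become orientation data, the sketch below derives tilt (roll and pitch) from a single three-axis reading taken at rest, when the only acceleration sensed is gravity. The axis conventions are an assumption, and a real Wii Remote does considerably more filtering and gesture recognition than this.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll and pitch (in degrees) from a 3-axis accelerometer at rest.

    At rest the sensor measures only gravity, so the direction of the
    gravity vector in the sensor frame gives the device's tilt.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)

# Example: device rolled so gravity falls partly on the y axis (units of g).
print(tilt_from_accel(0.0, 0.5, 0.86))
```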

[Image: Xbox 360 Kinect]

This is one example of the strides that have been made in the development of motion sensing
technology. Motion sensing technology is being advanced to include 3D programming and a
variety of sensor variations to improve and advance gaming. This technology will not only
improve gaming but will also have an impact on the way society moves forward in the future.
One of the products, the SMotion unit, contains a set of LEDs and is worn on the player’s belt. A
regular web camera tracks the beams of those LEDs and processes the data using algorithmic

software to determine the player’s body position in real time. It is reportedly ten times more
accurate and responds ten times faster than camera-only systems such as Kinect. Another device,
that can be used instead of SMotion, is the PMotion. It’s a platform that the player stands on,
which detects shifts in their center of gravity. Motion sensors also continue to be used in the auto
industry to assist with parking, and in security devices, digital cameras, and cell phones. Motion
sensing technology is developing at a rapid pace; this form of technology will soon be used in
our everyday lives to perform tasks that we normally do ourselves.
REFERENCES
Lomberg, Jason. "The Future of Motion-Sensing Technology." Electronic Component News, 14
Jan. 2010. Web. 16 Apr. 2013.
Romero, Joshua J. "How Do Motion-Sensing Video Game Controllers Work?" Scienceline, Arthur
L. Carter Journalism Institute at New York University, 18 Dec. 2006. Web. 15 Apr. 2013.
Shen, Jessie. "Motion Sensor Technology Moves towards Maturity with Related Markets to Hit
US$63.8 Billion." DIGITIMES Inc., 14 Oct. 2010. Web. 15 Apr. 2013.
Talk, Niko. "How Motion Sensor Technology Is Revolutionizing Video Games." N.p., 01 Dec.
2010. Web. 8 Apr. 2013.
"Video Games' Battle of Motion Sensor Technology." Seeking Alpha, 3 June 2009. Web. 16 Apr.
2013.

FUZZY LOGIC
TEVON EVERSLEY
Bedford Academy High School
Brooklyn, NY
Early Pioneers
Fuzzy logic is an approach to computing based on "degrees of truth" rather than the usual "true
or false" (1 or 0) of binary logic. It treats 0 and 1 as extreme cases of truth (or "the state of
matters" or "fact") and also includes the various states of truth in between, so that, for example,
the result of a comparison between two things could be "fairly tall" rather than simply "tall" or
"short". This has been a highly experimental technology since its introduction by Dr. Lotfi Zadeh
of the University of California at Berkeley. Dr. Zadeh first arrived at this idea because he believed
it was possible to create an artificial intelligence that would compute solutions to problems, as
well as everyday conflicts, based upon previous instances. Dr. Zadeh, the inventor of fuzzy logic,
was an Iranian scientist born in 1921 in Baku, in the Azerbaijan Republic. His father was also
Iranian and worked as a journalist; because of his father's position the family lived in Baku. His
mother was Russian and worked as a doctor.

[Photo: Dr. Lotfi Zadeh]

Shortly after Dr. Zadeh’s discovery of Fuzzy Logic he passed away. After this tragic event Fuzzy
Logic was put on hold for some years that is until late Professor ValiAllah Tahani took an
interest. After this event Fuzzy Logic then seemed to have a promising future once again as

20 2013 NATIONAL IT SHOWCASE PROCEEDINGS
2013

National BDPA Technology Conference

IT SHOWCASE PROCEEDINGS

research was then conducted in this area once again. This is an image of Professor ValiAllah
Tahani below:
In Depth View of Fuzzy Logic:
As mentioned previously, fuzzy logic is a computing system based on degrees of truth. This may
sound fairly simple, as if it were just a matter of entering the instances/information, but it
actually takes several years to complete and is fairly complicated. First, the artificial intelligence
system has to go through a learning process, a series of intense trials in which one has to teach
the system to differentiate the truth from what is not the truth. Once this process is conducted,
one then has to put it through several tests in which it must put that information to work, to see
whether or not it chooses the correct outcomes based upon the information previously put in. If
it doesn't choose the right outcome, you must correct it and show it the way until it is able to
choose the right outcome every time. Dr. Zadeh first proposed this idea during his work on fuzzy
logic intelligence as well as natural language, which poses difficulty when brought to terms of
absolute 0s and 1s.
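A tiny sketch may help show what "degrees of truth" look like in code. The membership function below is a hypothetical example, not Dr. Zadeh's notation; the 160 cm and 190 cm cutoffs are arbitrary choices for illustration.

```python
def tall(height_cm):
    """Degree of truth, between 0 and 1, that a person is 'tall'.

    Classical logic would force a yes/no cutoff; fuzzy logic lets the
    answer rise gradually between 160 cm (not tall) and 190 cm (tall).
    """
    return min(1.0, max(0.0, (height_cm - 160.0) / 30.0))

print(tall(150))  # 0.0 -> definitely not tall
print(tall(175))  # 0.5 -> partly tall
print(tall(195))  # 1.0 -> definitely tall
```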

Complications with Fuzzy Logic:
Fuzzy logic may be a promising field with vast capabilities, but it poses its fair share of
difficulties as well. One of these problems is the fact that it isn't fully reliable; the other is that it
doesn't make decisions with emotional implications. The decisions that we as humans make are
often based upon what we know as well as what we feel; unlike artificial intelligence, we don't
make decisions based solely upon what we know. This puts a hindrance on the technology and
places it at a disadvantage. For this problem a unique solution would have to be devised so that
all problems are addressed properly.

Solution to Complications:
My proposition for solving this problem is to implement a Failure Mode and Effects Analysis
(FMEA) component within fuzzy logic. In theory this would correct any potential problems that
may occur. I plan to attack the emotional implication problem by limiting the fuzzy logic
technology to handling only situations that don't involve morals and/or emotions, so that the
technology wouldn't be at a disadvantage.

In Depth View of Failure Mode and Effects Analysis:
Failure Mode and Effects Analysis was one of the first systematic techniques for correcting
failures within technologies. It was developed by reliability engineers in the 1950s to study
problems that might arise from malfunctions of military systems. An FMEA is often the first step
of a system reliability study. It involves reviewing as many components, assemblies, and
subsystems as possible to identify failure modes and their causes and effects. For each
component, the failure modes and their resulting effects on the rest of the system are recorded in
a specific FMEA worksheet; there are numerous variations of such worksheets. An FMEA is an
inductive failure analysis and a core task in reliability engineering. A successful FMEA activity
helps to identify potential failure modes based on experience with similar products/instances
and processes, or based on common physics-of-failure logic. Effects analysis refers to studying
the consequences of those failures on different system levels. Functional analyses are needed as
an input to determine correct failure modes. Failure probability can be reduced by understanding
the failure mechanism and reducing or eliminating the root causes that may lead to the failure
mode. It is therefore important to include the FMEA, as it documents the causes of failure. To put
it into layman's terms, Failure Mode and Effects Analysis is supposed to be a corrective process in
which the artificial intelligence corrects its mistakes based upon previous instances/information.
Some of the benefits of this technology are:
- It provides a documented method for selecting a design with a high probability of successful
operation and safety.
- A documented, uniform method of assessing potential failure mechanisms and failure modes
and their impact on system operation, resulting in a list of failure modes ranked according to the
seriousness of their system impact and likelihood of occurrence.
- Early identification of single failure points (SFPs) and system interface problems, which may be
critical to mission success and/or safety.
- A method of verifying that switching between redundant elements is not jeopardized by
postulated single failures.
- An effective method for evaluating the effect of proposed changes to the design and/or
operational procedures on mission success and safety.
- A basis for in-flight troubleshooting procedures and for locating performance monitoring and
fault-detection devices.
- Criteria for early planning of tests.

An Example of What an FMEA Chart Looks Like:
[Figure: sample FMEA worksheet]
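To show what the rows of such a worksheet typically capture, here is a minimal sketch using the common Risk Priority Number (RPN) scheme, in which severity, occurrence, and detection are each rated on a 1-10 scale and multiplied. The failure modes listed are made up for illustration, not taken from any real FMEA.

```python
# (failure mode, severity, occurrence, detection), each rated 1 (best) to 10 (worst)
worksheet = [
    ("sensor reads stale data",   7, 4, 3),
    ("rule base gives no output", 9, 2, 6),
    ("actuator stuck on",         8, 3, 2),
]

# Risk Priority Number: RPN = severity x occurrence x detection.
# Higher RPN means the failure mode deserves corrective action sooner.
ranked = sorted(worksheet, key=lambda row: row[1] * row[2] * row[3], reverse=True)

for mode, s, o, d in ranked:
    print(f"RPN {s * o * d:4d}  {mode}")
```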

With the introduction of Failure Mode and Effects Analysis alongside fuzzy logic, the two should
be able to coexist with one another simultaneously. This effects-analysis addition would, in
theory, log all possible failures within fuzzy logic, and in doing so would eliminate many
possible failures in the fuzzy logic system, making it more stable as well as more valuable. Fuzzy
logic is a viable product, as it would help to alleviate some pressure on energy consumption by
automatically regulating usage, such as the running time of light fixtures like lamps.

Applications in Society/Early Uses:
Some of the earliest and most common uses of fuzzy logic technologies are found in relatively
simple applications such as household appliances. These applications only scratch the surface of
the potential of fuzzy logic technology in complex mechatronic systems. Some of these
household appliances include rice cookers as well as lighting systems. For instance, rice cookers
implement fuzzy logic technology to make the process of cooking rice even easier than it already
is. The cooker does this by remembering/teaching itself how long on average it usually takes to
make a certain type of rice. It goes through this process for the first couple of uses; after this, in
theory, the rice cooker is supposed to shut itself off and on, on its own, based upon previous
data, as it should know the run time from previous experiences/instances. Systems such as these
are popular with the general public today, as they sell in major retail stores such as Macy's. The
cheapest of these systems usually run for about $120. This example shows the capabilities of
fuzzy logic technologies and how ineffectively we ourselves are using them. This is an image of a
fuzzy logic rice cooker below:

[Image: fuzzy logic rice cooker]

More in Depth View of the Rice Cooker:
Fuzzy-logic rice cookers have computer chips inside that direct their ability to make proper
adjustments to cooking time as well as to temperature. Unlike basic rice cookers, which complete
tasks in a single-minded, mechanical manner, a fuzzy logic system follows a somewhat more
complex process. While fuzzy-logic rice cookers function under the same premise as basic
models, their mathematical programming can deliver a slew of customized cooking options. The
trick behind these capabilities is that the rice cooker can react, making precise fluctuations in
cooking time and temperature depending upon the program selected. These may include
different settings such as keep-warm and quick-cooking cycles for the optimum cooking of rice
varieties like sushi rice, porridge rice, mixed rice, white rice, sweet rice and brown rice. Some
models also offer texture settings, allowing people to select hard or soft and sticky or wet rice.
Based upon your selection, it will go through its database, in which previous cooking times and
temperatures are stored, and then select the best combination for the type of rice you selected.
After this is done you will receive an outcome based upon your selection.
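One simple way a cooker could "remember and teach itself" an average run time is an exponential moving average over past cooks. This is a hypothetical sketch of that idea under stated assumptions, not any manufacturer's algorithm; the rice types, times, and the alpha weight are all made up.

```python
class CookTimeModel:
    """Keeps a learned cook-time estimate per rice type."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha      # weight given to the newest observation
        self.estimates = {}     # rice type -> learned minutes

    def update(self, rice_type, observed_minutes):
        old = self.estimates.get(rice_type, observed_minutes)
        # Exponential moving average: lean on history, adjust toward new data.
        self.estimates[rice_type] = (1 - self.alpha) * old + self.alpha * observed_minutes

    def suggest(self, rice_type, default=25.0):
        return self.estimates.get(rice_type, default)

model = CookTimeModel()
for minutes in (22.0, 24.0, 23.0):   # three observed white-rice cooks
    model.update("white", minutes)
print(model.suggest("white"))         # learned estimate for the next cook
```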
Renewable Energy:
Renewable energy has been an ever-growing field in the past years. In fact, this was a pressing
concern during the presidential debate, as the United States of America is one of the top
contributors to fossil fuel burning today. Another key point is that we are behind in the
renewable energy field. The main problem is that many people just don't see it as a real, viable
resource. They say "why do today what we can do tomorrow," but the real question is why do
tomorrow what you can do today. This saying influences many to make foolish decisions and
put off development and advancement in the renewable energy field. This stigma on energy has
changed: as seen in the chart below, projected investments in renewable energy have more than
doubled.

[Chart: projected investments in renewable energy]

As seen in the image above, there is an ever-growing awareness in the renewable energy field;
more people are beginning to see its importance. As shown in the charts below, the price of
gasoline has not only increased but is expected to continue increasing. This only emphasizes the
need for advancements in the renewable energy field, as cost will become a pressing matter for
future generations.

[Charts: gasoline price trends]

The main reason for the images above is to emphasize the point about renewable energy. With
the introduction of the fuzzy logic system working simultaneously with Failure Mode and Effects
Analysis, this technology would help to cut down on the United States' consumption of energy.
In theory, with the implementation of the fuzzy artificial intelligence, the system would factor in
the run time/usage of office spaces and the like, and from there put together a schedule for
exactly what times power should be directed to each room. During the times when a room is
scheduled to be offline, all power to that sector would cut off automatically, on its own, in order
to conserve energy.
Real Time Application Example:
A real-time example of this would be street lamps, as these appliances consume a great deal of
energy on a daily basis. First, the fuzzy logic system would evaluate the times at which the area
is most populated. From there it would create a run-time schedule for the lamppost. It would
also take into account the time the sun rises, as lampposts are not needed when natural light is at
our disposal. Once this information is inputted and processed, the fuzzy logic technology would
keep the lamppost on during normal run-time hours, but once it hits a time when the area isn't
as populated, the lights would dim to conserve energy. Lastly, when the sun begins to rise, the
artificial intelligence (A.I.) would cut off all power to that area/sector, as the lamppost would no
longer be needed. This is a viable technology because it not only saves on energy consumption,
which equates to saving money as well; it also saves on the cost of hiring employees, as the
artificial intelligence requires only a small group of employees to conduct maintenance rather
than a whole fleet of city employees to manage lighting. This is what makes fuzzy logic artificial
intelligence a viable as well as valuable asset to society today.
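Below is a rough sketch of the lamppost controller described above. The membership functions for "dark" and "busy" and all the thresholds are made-up assumptions; the only fuzzy-logic machinery shown is a single rule (bright when it is both dark and busy, using min() as the fuzzy AND).

```python
def darkness(hour):
    """Degree (0..1) to which it is dark; assumes sunrise ~6:00, sunset ~20:00."""
    if 7 <= hour <= 19:
        return 0.0
    if hour >= 21 or hour <= 5:
        return 1.0
    return 0.5  # dawn/dusk shoulder hours

def busyness(people_per_hour):
    """Degree (0..1) to which the street is busy; saturates at 200 people/hour."""
    return min(1.0, people_per_hour / 200.0)

def lamp_brightness(hour, people_per_hour, floor=0.2):
    """Fuzzy rule: bright when dark AND busy; dimmed when dark but quiet;
    off entirely in daylight."""
    dark = darkness(hour)
    busy = busyness(people_per_hour)
    if dark == 0.0:
        return 0.0                      # daylight: cut power entirely
    return max(floor, min(dark, busy))  # min() acts as the fuzzy AND

print(lamp_brightness(23, 150))  # late night, fairly busy  -> 0.75
print(lamp_brightness(3, 10))    # late night, nearly empty -> 0.2 (dimmed)
print(lamp_brightness(12, 300))  # midday                   -> 0.0 (off)
```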

REFERENCES
http://asq.org/learn-about-quality/process-analysis-tools/overview/fmea.html
http://www.clemson.edu/ces/credo/classes/fmealect.pdf
http://www.ncbi.nlm.nih.gov/pubmed/21302802
http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol2/jp6/article2.html
http://www.control-systems-principles.co.uk/whitepapers/fuzzy-logic-systems.pdf
http://faculty.ksu.edu.sa/73586/Documents/research_12.pdf
http://www.renewableenergyworld.com/rea/news/article/2007/04/seeking-careers-in-therenewable-energy-field-47980
http://landartgenerator.org/LAGI-FieldGuideRenewableEnergy-ed1.pdf
http://www.nrel.gov/docs/fy01osti/28369.pdf
http://www.future-alternative-energy.net/renewable-energy-jobs.html
http://www.renewableenergyworld.com/rea/home
http://www.ucsusa.org/clean_energy/our-energy-choices/renewable-energy/public-benefitsof-renewable.html
http://www.temple.edu/lawschool/iilpp/environmentalarticles/wang%20mingyuan.pdf
http://www.energy.vt.edu/Publications/Incr_Use_Renew_Energy_VA_rev1.pdf
http://www.iea.org/aboutus/faqs/renewableenergy/
http://wi-consortium.org/wicweb/pdf/Zadeh.pdf

CYCLOTRONS: FROM THE LAB TO THE FIELD
IAN CROWDER
Oak Park River Forest High School
Chicago, IL
Beyond its theories and research, particle physics has gained applications. By moving this science
"from the lab to the field," it now has practical uses in the real world. Cyclotrons are a prime
example of how a grandiose science can be focused onto more realistic uses.
In the past several decades, scientists' understanding of physics has advanced exponentially.
From James Chadwick's discovery of the neutron in 1932 to CERN's finding of the Higgs boson
in 2012, one of the key factors in this progress has been the development of particle accelerators.
Particle accelerators are usually used to study matter and energy. By using electromagnetic fields,
accelerators fire particles at extraordinarily high velocities. All around the world, accelerators
are renowned for their operations, but not all of them are the same.
There are many different kinds of accelerators, and each one has its own way of accelerating
particles. One simpler method is the linear accelerator, like the one at Fermilab in the United
States, which fires particles in a straight line. Another model is the synchrotron, like the one at
CERN in Switzerland, which uses electromagnets to bend the trajectory of fired particles into a
circular ring. But however well known these models are, they both retain one major
inconvenience: they take up too much space. The largest particle accelerators measure several
miles across. However, there are smaller, more convenient models. One of these models is called
a cyclotron. A cyclotron can be so compact that it could easily slip inside your pocket, and that is
why it is such a revolutionary technology.
In the 1920’s, physicist Ernest Lawrence thought that the
format of a linear accelerator was impractical for lighter
atomic particles, since it would need a vacuum tube
spanning the length of several meters to have adequate
acceleration. But this inconvenience inspired him to study
how one could use the same energy potential multiple
times instead of only once.

Ernest Lawrence

In 1929, Lawrence got an idea while perusing a German electrical engineering journal, in which
a man named Rolf Wideröe had sketched a device that would allow someone to use the same
electrical potential twice. This could be done by switching from positive to negative potential in
order to first push ions and then pull them, doubling the energy. Lawrence thought to use
magnetism to bend charged particles into spiral trajectories and therefore pass them through the
same accelerating gap over and over again. So he used a magnetic field to curve charged
particles through the same horizontal plane in a vacuum, over and over again. This made his
particle accelerator small and disc shaped, which was more convenient than the long chamber of
the linear accelerators that were so common in his time.

[Figure: 2-D schematic of a cyclotron]

A cyclotron is a circular accelerator except that, unlike CERN's piece of equipment, it is in the
shape of a spiral and not a ring. It is an accelerator in which subatomic particles (e.g., protons)
are charged in a gap at the center of the machine and are accelerated outward in a spiral trail.
The spiral lies on a plane that is perpendicular to a magnetic field, which is used to bend the path
of the particles. A high-frequency square wave creates an electric field across the gap, timed so
that the particle is accelerated at each passing of the gap rather than slowed down as it reenters.
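The key fact that makes this timing possible is that, at non-relativistic speeds, a particle's orbital frequency in a fixed magnetic field does not depend on its speed: f = qB / (2πm). The sketch below evaluates this for a proton; the 1.5 T field strength is an illustrative assumption, not a figure from the text.

```python
import math

ELEMENTARY_CHARGE = 1.602e-19   # C
PROTON_MASS = 1.673e-27         # kg

def cyclotron_frequency(charge, mass, b_field):
    """Revolutions per second of a charged particle in a uniform field B.

    f = qB / (2 * pi * m) is independent of speed, so a fixed-frequency
    alternating voltage keeps accelerating the particle at every gap crossing.
    """
    return charge * b_field / (2 * math.pi * mass)

def orbit_radius(mass, speed, charge, b_field):
    """Orbit radius r = mv / (qB): it grows as the particle speeds up,
    which is why the path traces out a spiral."""
    return mass * speed / (charge * b_field)

f = cyclotron_frequency(ELEMENTARY_CHARGE, PROTON_MASS, 1.5)
print(f"{f / 1e6:.1f} MHz")   # about 22.9 MHz for a proton at 1.5 T
```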
The cyclotron is a very unique particle accelerator with several different uses. It can be attached
to particle colliders as a source of charged particles in a lab. The cyclotron's propelled particles
are shot into a separate machine, where they collide with each other within a closed space. When
this happens, the accelerator becomes a particle collider, and with this scientists can measure the
effects of the impact. The consequences of these impacts can help researchers infer the properties
of many subatomic particles.
It can also be used in the field of medicine as a proton source for cancer treatment. One of the
greatest benefits cyclotrons give to the medical field is that they can be used in proton therapy to
treat cancer. This process involves accelerating protons to penetrate the body and kill tumors
through radiation damage, while minimizing damage to healthy tissue during the procedure. An
application like this is a perfect display of how cyclotrons are useful even outside of the lab.
The cyclotron has been around for almost eighty-three years now and is still used even today.
The invention of the cyclotron was revolutionary. This machine started as a tool to research
particles too small for the eyes to see, and it seemed almost irrelevant to the macroworld that we
live in, but, over the years, it evolved to have a more direct impact on society through its
contribution to medicine. The convenience of such a compact yet powerful technology makes it a
great example of how the very specific topic of particle physics can be applied to everyday life.
REFERENCES
"A Science Odyssey: People and Discoveries: Lawrence invents the cyclotron." PBS: Public
Broadcasting Service. N.p., n.d. Web. 15 Apr. 2013.
<http://www.pbs.org/wgbh/aso/databank/entries/dp31cy.html>.
"An Early History of LBNL by Dr. Glenn T. Seaborg." Lawrence Berkeley National Laboratory.
N.p., n.d. Web. 15 Apr. 2013. <http://www.lbl.gov/LBL-PID/Nobelists/Seaborg/65thanniv/12.html>.
"Glossary Item - Cyclotron." Science Education at Jefferson Lab. N.p., n.d. Web. 15 Apr. 2013.
<http://education.jlab.org/glossary/cyclotron.html>.
"The First Cyclotrons - Ernest Lawrence and the Cyclotron: AIP History Center Web Exhibit." The
American Institute of Physics -- Physics Publications and Resources. N.p., n.d. Web. 15 Apr. 2013.
<http://www.aip.org/history/lawrence/first.htm>.
Yarris, Lynn. "Ernest Lawrence's Cyclotron." Lawrence Berkeley National Laboratory. N.p., n.d.
Web. 15 Apr. 2013. <http://www.lbl.gov/Science-Articles/Archive/earlyyears.html>.

THE FUTURE OF AIR TRAVEL
JARED SHERROD
Delaware Academy of Public Safety & Security High School
Newark, DE
ORIGIN OF COMMERCIAL TRAVEL
Today, Commercial flight is commonplace and expected. There is a large science behind the
design, development, and execution of large
commercial airplanes. However, when man began to
take flight the majority of the planes where built for
one or two passengers at the most. So when did
commercial flight become commonplace.
Commercial air travel had a humble beginning with a
short airboat flight from St Petersburg to Tampa
Florida. This St. Petersburg-Tampa Airboat flight
lasted approximately 23 minutes in total. The flight
only carried two passengers, but the passengers paid
for a trip on a publicly scheduled commercial flight. The Benoist XIV boat plane, used for the first
commercial flight, was considered at the time to be much safer than planes build to land on
runways.
After 1945, airline travel became a popular means of transportation, starting the revolution
toward where the business is today. The progression of the commercial airline industry has
driven aeronautical engineering technology to its limits and sometimes well beyond. I believe
that now and in the future air travel will remain the best way to travel long distances for
business and pleasure, and I know there is room for improvement of airline systems and aircraft
to ensure continued progress of this transportation sector.

MILITARY INFLUENCE ON THE AVIATION BUSINESS
After WWI the United States was swamped with aviators. The military was a big influence on
the aviation industry. In 1918 the United States Postal Service was experimenting with air mail
using a training plane from the Army. Private operators were the first to fly the mail, and the
United States Postal Service was the first to use private aviators to fly commercial routes.
Eventually, due to numerous accidents, the United States Army was tasked with mail delivery
for the US Postal Service.
In 1925 the Ford Motor Company bought the Stout Aircraft Company. Ford successfully made
the first airliner, which was named the Trimotor. With a twelve-passenger seating capability, the
Trimotor was a turning point for commercial aviation, because previous planes did not provide
enough capacity to support a profitable airline business.
After WWII, aircraft manufacturers such as Boeing and Lockheed Martin started making bigger
aircraft inspired by WWII bombers. They stripped out the military gear and replaced it with
seats, storage, and passenger comforts. This tactic was successful and led to an air travel boom.
In the 1950s the airline industry began to develop planes specifically for air travel and
introduced jet and turboprop propulsion. This was the start of the jet age, with international
flights becoming very popular.
In the same time period, Juan Trippe began to build his own airline in America. He achieved his
goal with the creation of Pan American World Airways. Juan's fleet was made up of flying boats,
with routes from Los Angeles to Shanghai and Boston to London. Pan American and Northwest
Airlines were the only airlines to operate international routes in the 1930s. This was due to
extraordinary aircraft, the Boeing 247 and the Douglas DC-3, that made these companies very
profitable even through the Great Depression.
By the 1970s the airline industry and its planes had matured beyond anything conceivable back
when the first commercial flight took place in 1914. These aircraft used turbine jet engines and
new jumbo wide bodies. This made it possible to seat four hundred plus travelers on a 747 and
fly them at 500 miles per hour to their destination. These planes also revolutionized commercial
shipping, with the capacity to carry almost anything by air. Also, the TU-144 and the Concorde
made supersonic flight between America and Europe a reality. These planes bridged the gap
between the two continents in half the time it takes for conventional air travel.
Airplanes these days are technologically advanced, with many features that would overwhelm
anyone researching them, but there is always room for improvements to make them better for
the customer and the bottom line. There are three main areas that typically can be improved
upon: structure, aerodynamics, and propulsion.

The military has a major influence on cargo planes, where engineers have been finding new
materials and different types of structures to make the plane more lightweight, maneuverable,
and fuel efficient. The outcome of making these planes lightweight with more space is increased
profit, lower fuel cost, and better gliding ability if a dire emergency were to occur. Along with
modifications to the plane, to increase flight demand airlines also want to make travelers want to
travel and feel comfortable on their flight; today's first class offers very little for the technology
we have now. First class should become the new business class, and a new first class should be
an all-around entertainment experience with bars, pool tables, and a mini golf green, all in the
plane.
Flight propulsion is very important, with fuel efficiency the biggest concern. Propulsion is the
process of forcing an object to move. The average airline company spends about $47,288.2
million on plane fuel a year; this is where most of its income goes. There is a fuel that is made
from algae. This is a good fuel: it is good for the environment and can be used in the most
popular aircraft you see now. Algae is a good alternative because of its price. Even though not
much of this fuel is produced yet, it is a recyclable oil and very easy to make. Also, kerosene is in
short supply. This fuel can go very far in the long run. Boeing has already started to use this
product in its test aircraft.
The main body of the airplane is called the fuselage. The wings come out from the fuselage at
the center of gravity and hold the airplane's weight during flight. Attached to the trailing edges
of the wings are the ailerons, which are control surfaces used to control the plane on the roll
axis, and the flaps, which enable the plane to descend more easily on final approach. The wing
may attach to the fuselage at the top (high wing), at the bottom (low wing), or even cut through
the fuselage (mid-wing). Some early airplanes had both high and low wings (biplanes) and on
rare occasions all three (triplanes). Biplanes usually only had control surfaces on one of their sets
of wings. In some designs the wing may not attach directly to the fuselage at all, instead
attaching to the aircraft via stiff structural pieces.
The tail usually consists of three smaller fins, two horizontal and one vertical, called the
horizontal and vertical stabilizers. Attached to the trailing edge of the horizontal stabilizers are
the elevators; these control surfaces are used to control the airplane on the pitch axis. Attached
to the trailing edge of the vertical stabilizer is the rudder, used to control the airplane on the yaw
axis. Tails also come in many different configurations: T-tails put the horizontal stabilizers at the
very top of the vertical stabilizer, H-tails have two smaller vertical stabilizers, and V-tails instead
have just two stabilizers with corresponding ruddervator control surfaces. A ruddervator is a
butterfly, or vee, tail surface which combines the effect of the rudder and the elevators. The
result is movable surfaces that cause the aircraft to pitch up or down when they move together
and deviate from a straight course when they move differentially.
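The together/differential behavior of a ruddervator is easy to express as a mixer. This sketch uses a common V-tail mixing rule with normalized stick inputs; the sign conventions and the 50/50 mix are assumptions for illustration, not a specification from any aircraft.

```python
def ruddervator_mix(elevator, rudder):
    """Mix pitch and yaw commands (each -1..1) into left/right V-tail deflections.

    Moving both surfaces together produces pitch; moving them
    differentially produces yaw.
    """
    clamp = lambda x: max(-1.0, min(1.0, x))  # respect servo travel limits
    left = clamp(elevator + rudder)
    right = clamp(elevator - rudder)
    return left, right

print(ruddervator_mix(0.5, 0.0))   # pure pitch: both surfaces move together
print(ruddervator_mix(0.0, 0.5))   # pure yaw: surfaces move opposite ways
```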
The engines may be mounted in many different places depending on the type of aircraft. Most
civilian single-engine airplanes have the engine and propeller mounted on the front of the
aircraft; most light twins have them mounted on the wings; jets may have engines mounted on
spars below the wings or on the back section of the fuselage. Aircraft have been made in many
strange configurations, however: sometimes the engine can be in the back in a pusher-prop
configuration, mounted on a spar well above the airplane, or, in the case of the Cessna
Skymaster, one mounted on the front and another in back. Aircraft designers have always been
keen to try new and strange ideas.
Landing gear typically comes in two types: tricycle and tail-dragger. Tricycle-gear aircraft have
the wheels configured just like a tricycle, with one wheel under the nose. Tail-draggers instead
have a small wheel on the tail. Airplanes may also be equipped with floats instead of wheels for
water landings, or skis for snow and ice landings. Additionally, some airplanes are designed
with a boat-shaped hull for water landings.

Figure 1: Tail-Dragger Landing Gear

Aeronautical engineering for passenger aircraft is one of the hardest and most confusing jobs in the aircraft industry today. My plane would take the future airplane to a whole other level. It would have a jumbo-jet-style fuselage with room all around for the comfort of tall people, so there would be no need to crunch up in tight spots. The engines would be on a tilt-wing axis for short-takeoff ability, which would make it possible to take off right from the gate. The cockpit would have all-touch technology for good accessibility for the pilot. This would be a good resource, but one drawback is the possibility of failure, so there would also be a backup set of non-fiber-optic controls for emergencies. This paper discussed the parts of passenger aeronautical engineering, and I hope I informed you about the many topics of this research paper.
REFERENCES
www.boeing.com/.../05/corp_innovative_thinking_05_07_12.html
http://www.answers.com/topic/ruddervators
http://www.answers.com/topic/ruddervators#ixzz2Ti3h77K6
http://travelforaircraft.wordpress.com/2011/04/13/early-flying-boat-early-airliner-benoist-xivairboat/
http://careers.stateuniversity.com/pages/818/Flight-Engineer.html
UNDERGRADUATE PRESENTATIONS AT A GLANCE
NEURAL NETWORKS
Nicholas Reid

SECURING BIG DATA
Alex Boardley

TERRAIN CLASSIFICATION
Jazzmine Bess

THE EXPANSION OF CYBER SECURITY
Avery Sherrod

THE TRUTH ABOUT BIOARTIFICIAL ORGANS
Jessica Boardley

MODERN WEARABLE TECHNOLOGIES
Michael Bijou

CLOUD TECHNOLOGY
Bryan C. Bemley and Chauncey Miller


MEET THE UNDERGRADUATE PARTICIPANTS

NICHOLAS REID
Baruch College
Brooklyn, NY
A 2011 graduate of Medgar Evers College Preparatory High School, Nicholas Reid majored in math and science and received an Advanced Regents High School Diploma. Nicholas has been accepted into the fall 2013 semester at the Polytechnic Institute of New York University, where he will be majoring in Physics and minoring in Computer Technology. His presentation is "Neural Networks".

ALEX BOARDLEY
Wilmington University
Wilmington, DE
Alex Boardley is a sophomore at Wilmington University majoring in Computer Network and Security. His presentation is "Securing Big Data".

JAZZMINE BESS
Florida A&M University
Tallahassee, FL
Jazzmine Bess is a fourth-year computer information systems student. She will be graduating in December 2013, magna cum laude. Her presentation is "Terrain Classification".

AVERY SHERROD
Delaware Technical and Community College
New Castle, DE
Avery Sherrod is currently attending Delaware Technical and Community College, where he holds a 3.2 overall grade point average and is on target to graduate with an Associate of Science degree in Computer Information Systems in May 2013. His presentation is "The Expansion of Cyber Security".

JESSICA BOARDLEY
University of Delaware
Newark, DE
Jessica Jean Boardley is a freshman at the University of Delaware in Newark, Delaware. She plans on becoming an officer and Mental Health Nurse in the Air Force. Her presentation is "The Truth About Bioartificial Organs".

MICHAEL BIJOU
Bowie State University
Bowie, MD
Michael Bijou is an aspiring computer scientist from
Arlington, Virginia. He is currently attending Bowie State
University to obtain a bachelor's degree in computer
science. His presentation is “Modern Wearable
Technologies”.


BRYAN BEMLEY
Bowie State University
Bowie, MD
Bryan Bemley is currently attending Bowie State University in Bowie, Maryland. He is doing research in 3D modeling and animation, graphic arts and design, web design and development, mobile application development, and high-performance computing with a slight focus on visualization. Some of his future goals are to start his own 3D animation, web development, and research company and to develop user applications that make everyday use of technology better. His presentation is "Crash Course Cloud Computing".

NEURAL NETWORKS
NICHOLAS REID
Baruch College
Brooklyn, NY
As humans, we are naturally innovative; we find new ways to do old things, or create new things to help ease or perform tasks in life, be it cooking, driving, or even diagnosing someone in a doctor's office. As we grow, technology grows, and it is becoming more prevalent in the medical field. There has been a sudden boom of Artificial Intelligence systems being applied in the medical field. This paper will examine neural networks. In particular, my focus will be on how neural networks relate to the cardiac medical field and how they are used to help diagnose heart arrhythmias.
What are neural networks? As defined by dictionary.com, neural networks are "Also called neural nets. A computer model designed to simulate the behavior of biological neural networks, as in pattern recognition, language processing, and problem solving, with the goal of self-directed information processing." Neural networks can be split into two subcategories: Biological Neural Networks (BNNs) and Artificial Neural Networks (ANNs). To understand how ANNs work, we first need to understand how BNNs work, or how our brains process information and learn new things. The human brain is made up of cells known as neurons. Neurons are basically composed of three things: dendrites, an axon, and a cell body. The dendrites are little finger-like projections, located at the head of the neuron, that take electrical inputs from other neurons. The cell body, usually called the soma, is where the nucleus of the neuron is, and the axon is the usually long process of a nerve fiber that generally carries impulses away from the body of the nerve cell. At the end of the axon, it splits into tiny branches; at the end of each branch is a structure called a synapse. The synapse converts the activity in the axon into electrical impulses that inhibit or excite the neighboring connected neurons, through synaptic connections from the end of one neuron to the dendrite base of another. When a neuron receives excitatory input that is sufficiently large compared with its inhibitory input, it sends a spike of electrical activity down its axon. Basically, a biological neuron receives inputs from other sources, combines them in some way, performs a generally nonlinear operation on the result, and then outputs the final result. Learning occurs by changing the effectiveness of the synapses so that the influence of one neuron on another changes. In other words, the brain has developed ways for neurons to change their response to new stimulus patterns so that similar events may affect future responses (self-adaptation, learning).
ANNs are an attempt to emulate the brain. Their architecture is designed analogously to how information is processed in our brains. ANNs contain many processors, or units, also known as neurons; the units are interconnected with each other. A neuron in an ANN is a device that carries out all the basic functions of a biological neuron; however, it is much simpler than a biological neuron. Each artificial neuron receives a set of inputs x1, ..., xn, from either external factors or other neurons, along with connection weights w1, ..., wn. The inputs are multiplied by their connection weights and summed: S = w1*x1 + ... + wn*xn. This sum S is the input to a transfer function F(S), and the output of that function is the neuron's result. While most ANNs are made from this basic unit, some differ by varying it.
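
To make the computation concrete, here is a minimal sketch in Python of the artificial neuron just described; the sigmoid transfer function is an assumed choice of F(S), since the paper does not fix one.

import math

def sigmoid(s):
    """Transfer function F(S); squashes the sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-s))

def neuron(inputs, weights):
    """Weighted sum of inputs followed by the transfer function."""
    s = sum(x * w for x, w in zip(inputs, weights))  # S = w1*x1 + ... + wn*xn
    return sigmoid(s)                                 # result = F(S)

# Example: three inputs and their connection weights (illustrative values).
print(neuron([0.5, 1.0, -0.3], [0.8, -0.2, 0.4]))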

The design of a neural network is a very complex and arduous process. The developer must go through a trial-and-error phase, making a series of design decisions before arriving at a satisfactory design. The issues that occur in this phase are major concerns for ANN developers. Designing an ANN consists of: putting the neurons in different layers; choosing the type of connections for inter-layer (different layers) or intra-layer (same layer) connected neurons; deciding how a neuron will receive input and produce output; and determining the strength of the connection weights by letting the network adjust itself using a training set.
BNN’s are constructed in 3-Dimensional ways, made from very small components. These
neurons seem to be able to inter-connect to each other limitlessly; however, in ANNs that is not
the case nor is that possible…yet. ANNs are simple clustering of artificial neurons which are
done by creating layers, which are then connected to each other. Mostly, all ANN’s have the same
architecture. Some neurons interface with the world to take in its input while others interface
with the world to output info, and others are hidden from us.
Neurons are grouped into layers. The input layer is where neurons take input from the external
environment-the world, the output layer consist, of neurons that communicate the result of the
network to the external environment, and there are one, two, or more hidden layers in between
these two layers.

When the input layer neurons receive input, they output results which become the input to other layers of the network; this process repeats itself until a condition is met or the output layer is reached. Determining how many hidden layers a network should have to function at its peak is left to trial and error. If you have too many hidden layers, your network will overfit: given a training set, instead of learning a pattern and deducing an answer, the network will simply memorize the training set, rendering it useless.
Neurons are connected to each other by a number of paths. In most cases these paths are unidirectional; rarely, there is a two-way connection between two neurons. A neuron may communicate with neurons in its own layer or with neurons in a different layer, each processed by different means. We can split ANN communication into two categories: inter-layer and intra-layer.
Inter-layer communication is when neurons from different layers communicate with each other. There are several types. Fully connected: each neuron on the first layer is connected to every neuron on the second layer. Partially connected: each neuron on the first layer does not have to be connected to a neuron on the second layer. Feed forward: the neurons on the first layer send their outputs to the second layer but do not receive any input back. Bi-directional: there is another set of connections carrying outputs from the second layer back to the first. Hierarchical: the neurons of a lower layer can only communicate with the neurons in the layer exactly above it. Resonance: the layers have bi-directional connections and can continue sending messages across the connections a number of times until a certain condition is achieved.
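
As an illustration, here is a minimal sketch in Python of fully connected, feed-forward inter-layer communication; the weights and the sigmoid activation are assumptions chosen for the example.

import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def layer_forward(inputs, weight_matrix):
    """One fully connected layer: each row of weights feeds one neuron."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)))
            for row in weight_matrix]

# Two inputs feed a fully connected hidden layer of three neurons; outputs
# flow forward only, never back.
hidden = layer_forward([0.5, -1.0], [[0.1, 0.4], [-0.3, 0.2], [0.7, -0.6]])
output = layer_forward(hidden, [[0.5, -0.5, 0.9]])  # three hidden -> one output
print(output)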
In more complex ANN structures, neurons communicate among themselves within a layer; this is intra-layer communication. There are two types of intra-layer communication: recurrent and on-center/off-surround. Recurrent: the neurons within a layer are fully or partially connected to one another. After these neurons receive input from another layer, they circulate their outputs with each other before they are allowed to send their outputs to another layer; generally, some condition among the neurons of the layer must be achieved before they communicate their outputs to another layer. On-center/off-surround: a neuron within a layer has excitatory connections to itself and its immediate neighbors, and inhibitory connections to other neurons.


Learning in BNN’s basically occurs by experience, the same with ANN’s. Sometimes called
machine training algorithms, because of their changing connection weights, causes the network
to learn the solution to a problem. The learning ability of a net depends on its type of architecture
and what algorithmic method is used to train it. The training of nets usually consist of one of
these 3 methods, Unsupervised learning, Reinforcement learning, Back propagation.
Unsupervised learning is when the hidden layer of neurons must organize themselves without
any help from the outside (the developer). In this method, the net does not have any sample
outputs to which it can compare its outputs. This method is often coined by the phrase learning
by doing. Reinforcement learning is the neurons in the hidden layer are randomly shuffled then
reshuffled when its told how close it is to solving the problem. Reinforcement learning is also
called supervised learning because it requires a so-called teacher which can be a training set of
data or an observer who grades the nets performance. Back propagation is the net is not just
given reinforcement on how its solving the problem, information about its errors are sent back
through the net and is then used to adjust the connection weights between layers so as to
improve performance.
ANN learning can also be classified as on-line or off-line. In off-line learning methods, once the system enters operation mode, its weights are fixed and do not change any more. Most networks are of the off-line learning type. In on-line or real-time learning, when the system is in operating mode (recall), it continues to learn while being used as a decision tool. This type of learning has a more complex design structure.
In ANN’s, there are a variety of algorithms that are used to update connection weights. These
algorithms are known as learning laws. A few of the major learning laws are Hebb’s Rule,
Hopfield Law, The Delta Rule, and Kohonen’s Learning Law. Hebb’s Rule is the first and most
known learning law created by Donald Hebb. The rule is: If a neuron receives an input from
another neuron and if both are highly active (mathematically have the same sign), the weight
between the neurons should be strengthened. Hopfield Law states,” if the desired output and the
input are both active or both inactive, increment the connection weight by the learning rate,
otherwise decrement the weight by the learning rate.” The Delta Rule (TDR) is another variant of
Hebb’s Rule. TDR works by continually modifying the connection strengths in the net to reduce
the difference between the desired output value and the actual output value. Kohonen’s Law,
This procedure, developed by Teuvo Kohonen, was inspired by learning in biological systems. In
this procedure, the neurons compete for the opportunity to learn, or to update their weights. The
processing neuron with the largest output is declared the winner and has the capability of
inhibiting its competitors, as well as, exciting its neighbors. Only the winner is permitted output,
and only the winner plus its neighbors are allowed to update their connection weights.
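
To make The Delta Rule concrete, here is a minimal sketch in Python of one training loop; the linear neuron output, the starting weights, and the learning rate are assumptions made for illustration.

def delta_rule_step(weights, inputs, desired, learning_rate=0.1):
    actual = sum(w * x for w, x in zip(weights, inputs))  # linear neuron output
    error = desired - actual                              # desired minus actual
    # Nudge each weight in proportion to its input and the remaining error.
    return [w + learning_rate * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
for _ in range(50):                      # repeat until the error is small
    weights = delta_rule_step(weights, inputs=[1.0, 2.0], desired=1.0)
print(weights)  # the weighted sum of the inputs now approximates the desired 1.0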
The history of neural networks dates back to 1943, when neurophysiologist Warren McCulloch and mathematician Walter Pitts wrote a paper on how neurons might work. In order to describe how neurons in the brain might work, they modeled a simple neural network using electrical circuits. Later, in 1949, Donald Hebb wrote "The Organization of Behavior," a work which pointed out that neural pathways are strengthened each time they are used, a concept fundamentally essential to the ways in which humans learn. If two nerves fire at the same time, he argued, the connection between them is enhanced. As computers became more advanced in the 1950s, it finally became possible to simulate a hypothetical neural network.
In 1959, Bernard Widrow and Marcian Hoff of Stanford developed models called "ADALINE" and "MADALINE"; the names come from their use of Multiple ADAptive LINear Elements. ADALINE was developed to recognize binary patterns so that, if it was reading streaming bits from a phone line, it could predict the next bit. MADALINE was the first neural network applied to a real-world problem, using an adaptive filter that eliminates echoes on phone lines. In 1962, Widrow and Hoff developed a learning procedure that examines the activation value before it is adjusted. It is based on the idea that while one active neuron may have a big error, one can adjust the activation values to distribute it across the network, or at least to adjacent neurons. Despite the later success of the neural network, traditional von Neumann architecture took over the computing scene, and neural research was left behind. In the same time period, a paper was written suggesting there could not be an extension from the single-layered neural network to a multiple-layered neural network. In addition, many people in the field were using a learning function that was fundamentally flawed because it was not differentiable across the entire line. As a result, research and funding went drastically down.
In 1982, interest in the field was renewed. John Hopfield of Caltech presented a paper to the National Academy of Sciences. His approach was to create more useful machines by using bidirectional lines; previously, the connections between neurons were only one way. That same year, Reilly and Cooper used a "hybrid network" with multiple layers, each layer using a different problem-solving strategy. Additionally, in 1982 there was a joint United States-Japan conference on Cooperative/Competitive Neural Networks. Japan announced a new Fifth Generation effort on neural networks, and US papers generated worry that the US could be left behind in the field. By fifth-generation computing, Artificial Intelligence was being used.
In 1986, with multiple-layered neural networks in the news, the problem was how to extend the Widrow-Hoff rule to multiple layers. Three independent groups of researchers, one of which included David Rumelhart, a former member of Stanford's psychology department, came up with similar ideas, now called back-propagation networks because they distribute pattern-recognition errors throughout the network. Hybrid networks used just two layers; these back-propagation networks use many. The result is that back-propagation networks are "slow learners," needing possibly thousands of iterations to learn.
Now that we have a general grasp of what an ANN is, I will explain how they are used in the cardiac medical field, more specifically in ECG analysis. First, what is an electrocardiogram, or ECG? As defined by mayoclinic.com, "An electrocardiogram is used to monitor your heart. Each beat of your heart is triggered by an electrical impulse normally generated from special cells in the upper right chamber of your heart. An electrocardiogram — also called an ECG or EKG — records these electrical signals as they travel through your heart." ECG examination has been used to diagnose cardiovascular disease. In many cases, such as the Intensive Care Unit (ICU) or the
Holter system, recording and analyzing ECG patterns is necessary, and automating this task can help reduce the physician's workload and improve the efficiency of diagnosis.
Typical ECG’s waves contain the P wave, QRS complex, and T wave in each heartbeat.
Recognizing an ECG pattern is essentially the process of extracting and classifying ECG feature
parameters, which may be obtained either from the time domain or transform domain. The
features being frequently used for ECG analysis in time domain include the wave shape,
amplitude, duration, areas, and R-R intervals.
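
As a minimal sketch of one of these time-domain features, the Python snippet below extracts R-R intervals, the spacing between successive R peaks of the QRS complex; the detection threshold and sampling rate are illustrative assumptions, not values from the studies cited here.

def r_peak_indices(signal, threshold=0.6):
    """Indices where the signal crosses above the threshold (candidate R peaks)."""
    return [i for i in range(1, len(signal))
            if signal[i - 1] < threshold <= signal[i]]

def rr_intervals(signal, sample_rate_hz=250):
    """R-R intervals in seconds between consecutive detected R peaks."""
    peaks = r_peak_indices(signal)
    return [(b - a) / sample_rate_hz for a, b in zip(peaks, peaks[1:])]

# Toy waveform: two R-like spikes 250 samples (one second) apart.
ecg = [0.0] * 1000
ecg[100] = ecg[350] = 1.0
print(rr_intervals(ecg))  # -> [1.0]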
The problem with automating ECG analysis is threefold. The first issue is the non-linearity in ECG signals: basically, the haphazardness and unpredictability of an ECG. The second is the large variation in anatomies. The third is how ECGs are contaminated by background noise (signal interference), like electrode-motion artifact and electromyogram-induced noise, which also adds to the difficulty of automatic ECG pattern recognition. Given all of these issues, ANNs are the perfect solution. ANNs are known for their robustness and non-linearity, which is why they are good at dealing with non-linear problems and tolerant of noise. ANNs are also characterized by their adaptive abilities and fast recognition speed. A good example of how ANNs are used in ECG automation is a study done by Lin He, Wensheng Hou, Xiaolin Zhen, and Chenglin Peng at the Biomedical Engineering College of Chongqing University. In this study they chose four types of ECG patterns taken from the Massachusetts Institute of Technology - Beth Israel Hospital (MIT-BIH) Arrhythmia Database and created three different neural networks, SOM (Self-Organizing Map), BP (Back Propagation), and LVQ (Learning Vector Quantization), to recognize the ECG patterns.
The SOM ANN is an unsupervised self-learning neural network, developed by Teuvo Kohonen in 1981. A SOM net consists of one input layer and one competitive layer. The BP net was a supervised multilayer feed-forward net. The LVQ net is composed of one input layer, one hidden layer, and one output layer; the neurons in the hidden layer and output layer produce only binary outputs. After the nets were trained and put into action, the overall accuracy of each net's analysis was over 90%; the SOM net was the most accurate, with an overall accuracy of 95.5%. The point of the study was to prove the effectiveness of neural networks in recognizing ECG patterns and actually helping doctors with their diagnoses. There are many other studies conducted at different institutions, such as "Atrial fibrillation classification with artificial neural networks" by Sadik Kara and Mustafa Okandan at Erciyes University, and "Classifying Multichannel ECG Patterns with an Adaptive Neural Network" by S. Barro, M. Fernandez-Delgado, J.A. Vila-Sobrino, C.V. Regueiro, and E. Sanchez at the University of Santiago de Compostela. All these studies show that neural networks are beneficial to the cardiology field.
In the future, I believe neural network applications will become an essential part of a cardiologist's "toolbox," as well as of every ICU in hospitals. Their ability to interpret and recognize ECG patterns, help diagnose heart disease, and predict myocardial infarctions is going to be an integral part of helping prolong the lives of people with or without cardiovascular complications.
In conclusion, there has been a sudden rise in the use of neural networks in the medical field. In this paper we discussed what neural nets are, how they help improve cardiovascular health, keying in on ECG pattern recognition and analysis, and whether they have a foreseeable future in the cardio-medical field.

REFERENCES
Kara, Sadik, and Mustafa Okandan. "Atrial fibrillation Classification with Artificial Neural
Networks." Pattern Recognition (n.d.): n. pag. 5 Mar. 2007. Web.
Barro, S., M. Fernandez-Delgado, J.A. Vila-Sobrino, C.V. Regueiro, and E. Sanchez. "Classifying
Multichannel ECG Patterns with an Adaptive Neural Network." IEEE ENGINEERING IN
MEDICINE AND BIOLOGY (1998): n. pag. Jan.-Feb. 1998. Web.
He, Lin, Wensheng Hou, Xiaolin Zhen, and Chenglin Peng. "Recognition of ECG Patterns Using
Artificial Neural Network." The IEEE Computer Society, n.d. Web.
Klerfors, Daniel. Artificial Neural Networks. Rep. N.p.: n.p., n.d. Print.
"Electrocardiogram (ECG or EKG) - MayoClinic.com." Mayo Clinic. N.p., n.d. Web. 18 May 2013.
<http://www.mayoclinic.com/health/electrocardiogram/MY00086>.
"Neural Networks - History." WWW-CS-FACULTY & STAFF Home Page (12-Apr-1995). N.p.,
n.d. Web. 18 May 2013. <http://www-csfaculty.stanford.edu/~eroberts/courses/soco/projects/neuralnetworks/History/history1.html>.


SECURING BIG DATA
ALEX BOARDLEY
Wilmington University
Wilmington, DE
WHAT IS BIG DATA?
The definition is "a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications." According to SAS, big data is a popular term used to describe the exponential growth, availability, and use of information, both structured and unstructured (SAS). What does that mean? The volume of data being collected is a challenge to software applications: software cannot obtain and process the data in a way we can use efficiently and in a timely manner. So which companies use this much data?
Ever hear of Software AG, Oracle Corporation, IBM, Microsoft, SAP, EMC, or HP? Together they have spent more than $15 billion on software firms specializing only in data management and analytics, and all for a good reason: holding, using, and keeping this data safe can be the difference between a thriving industry and an unsuccessful business. Analytics is important to invest in because it can give you a competitive edge over your competitors. A person in analytics uncovers hidden patterns in the big data and tries to pick up useful trends to help the company, all in all increasing revenue.
VOLUME, VELOCITY AND VARIETY
You may know there are 3 “V”s to big
data, Volume, velocity, and variety.
Volume meaning that it is almost
impossible to understand the shear
amount of data, velocity meaning data
comes in fast and furious, and variety
meaning data comes in all forms ,
structured un-structured and semistructured. This idea was created by

45
Gartner, Inc. back in 2001. For a better understanding Gartner tweeted this, “Big data” is highvolume, -velocity and -variety information assets that demand cost-effective, innovative forms of
information processing for enhanced insight and decision making. The most interesting of the 3vs
is variety. The ever growing volume of data attributed to years of data collection or the
increasing use of text message from increase participation in social media, is contributing to the
storage issues of today. How does one determine what data is relevant when looking at such a
vast amount of data.
Factors that contribute to the increase in data volume include the constant streaming of data from social media. Many family arguments are based on the overuse of a family data plan; this is just one example of the daily overuse of data. What makes any of this data relevant? In many cases, all of the data needs to be stored and protected without setting a priority on one set of data over another. The data is received in a variety of formats, including video, photos, emails, documents, and a host of other unique files. Velocity is represented by the massive amounts of data that get generated on a daily basis. RFID tags are an example of solutions that contribute to the increasing velocity of data.

WHY DO HACKERS PLANT VIRUSES?

The bulk of all the data out there is already owned by organizations, but it is unused. This data is called dark data. For example, elevator logs help to predict vacated real estate: the organizations own this data, but it will remain unused. What are the reasons hackers are infecting our computers? Small businesses are ripe targets for hackers. See the chart to the left for more detail: it shows the distribution of attacks by company size, as reported in Verizon's 2012 Data Breach Report. The results are staggering. Bad hackers are largely at war with small businesses: 70% of all attacks were perpetrated against tiny businesses with 100 employees or fewer.
To gain access to an organization's most valuable property, hackers are only getting better. A hacker is someone who seeks and exploits weaknesses in a computer system or computer network. Hackers can be motivated by many different reasons, including profit, protest, or challenge.
The lure of financial gain has motivated cybercriminals to implement innovative new methods and to become more thorough with each passing year. Today, cybercriminals are sophisticated, evolving their craft and tools in real time. For example, malware created today often undergoes quality-control procedures: cybercriminals test it on numerous machines and operating systems to ensure it bypasses detection. Spammers, criminal syndicates, and other hostile entities are switching their attention from conventional high-volume spam to carefully crafted, low-volume attacks that are more sophisticated, devious, and potentially costly. Instead of peddling counterfeit drugs or luxury goods, they're launching phishing attacks, distributing email with fake links or malware attachments (such as keyloggers and rootkits) that enable criminals at remote locations to surreptitiously siphon or "infiltrate" valuable business data from an enterprise network. Hacking can even be international; to a hacker, we are all next door. The New York Times has reported on serious possibilities that China's military has stolen information from companies "involved in the critical infrastructure of the United States -- its electrical power grid, gas lines and waterworks."

VIRUSES AND VULNERABILITIES
In the 1990s, the average personal computer user received one or two spam messages a day. As of August 2010, the amount of spam was estimated to be around 200 billion spam messages sent per day. Today there are nine major types of computer viruses that infect millions of computers every day: macro viruses, boot sector viruses, browser hijackers, direct action viruses, file infector viruses, multipartite viruses, polymorphic viruses, resident viruses, and web scripting viruses. They can do anything from spamming your computer to deleting all of your files and stealing your personal information, like social security numbers and credit card numbers, for their own personal use.
The reason they are called viruses is that they have the ability to replicate. Many viruses, which disguise themselves as tracking cookies, are meant to allow access to personal information that you give out over the internet. This is why it is good to always make time to delete all the cookies in your history. When you are purchasing items online, especially from non-recommended sites, there is a risk of identity theft. Viruses can also slow down your computer significantly, erase information, destroy vital data, and even shut down your computer.
The risk is equally present on a mobile device like a cell phone or tablet. Thanks to online and mobile transactions, social media traffic, and GPS coordinates, we now generate over 2.5 quintillion bytes a day, and this is predicted to increase by an outstanding 100% by 2015. Discovering that your intimate conversations, pictures, or texts have been splattered across the Internet for all to see is not only an invasion of privacy but can also be damaging to your personal life and livelihood. It is important that you don't give out any important information over text messaging, even to somebody you trust: you never know what your relationship with that person will become, so you must be prepared to change your passwords immediately. Also make sure you set a password on your voice mail; most phones come with a default code to listen to your voice mail, and that can be obtained from other phones. Lastly, watch out for the applications you download on your device. Malware, which can steal personal information, can be hidden within applications, more commonly on Android phones and jailbroken Apple products. How can we prevent our computers from being hacked? Make sure you keep up to date on your software updates. Install a firewall on your computer. Change your password every month. Purchase or download anti-virus software. Install anti-spyware programs. Last, make sure you delete emails from unknown sources.
Now imagine a hacker having access to terabytes and terabytes of other people's personal information, and that's where we have a problem. If companies possess big data, they need to possess efficient ways to keep it safe. To keep big data safe, large companies follow a multitude of steps. When data is collected by the organization, it is then sent to their cloud, an off-site area where they store their big data. Every organization, for safety, should have another site to back up all of the information they have, just in case of complication or system compromise. It may cost more money, but you would save more in the long run.

DATA STORAGE
The data is stored in storage tiers, with Tier 1 being the most important and efficient storage because that data is used most. Tier 1 hosts mission-critical servers and computer systems, with fully redundant subsystems (cooling, power, network links, storage, etc.) and compartmentalized security zones controlled by biometric access-control methods. Tier 2 is a little less important, and Tier 3 is the least important, with minimal checkups.
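
As a toy illustration of the tiering idea, the Python sketch below assigns data to a tier by how often it is accessed; the thresholds and file names are assumptions for the example, not part of any standard.

def storage_tier(accesses_per_day):
    if accesses_per_day >= 100:   # hot data: mission-critical, fully redundant
        return 1
    if accesses_per_day >= 10:    # warm data: less critical
        return 2
    return 3                      # cold data: minimal checkups

for name, rate in [("orders.db", 5000), ("reports/", 25), ("archive2009/", 0)]:
    print(name, "-> tier", storage_tier(rate))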
How can companies that have big data keep it safe? With help from companies like Proofpoint. Fortunately, there is a new approach to threat detection and remediation: instead of pattern-matching against "known bad" email, Proofpoint's security-as-a-service platform applies big data analysis techniques to continuously analyze billions of email messages and ever-changing patterns of communication, enabling the platform to detect anomalous behavior.

TYPES OF BIG DATA SECURITY
There are two different forms of big data security analytics: real-time big data security analytics solutions and asymmetric big data security analytics solutions. Real-time solutions are built around a distributed architecture, made up of appliances designed for local streaming processing and collective parallel processing. Real-time big data security analytics solutions tend to collect and analyze old standby data like logs, network flows, and IP packets across the enterprise, with a view of the data from L2 through L7. Asymmetric big data security analytics solutions can be
built on proprietary data repositories, but it is likely that all products will support big data
technologies like Cassandra, Hadoop, and NoSQL over time.
Security analysts will feed these solutions with batch updates containing terabytes of structured
and unstructured data in order to look at historical security trends over long periods of time.
Asymmetric big data security solutions will be anchored by machine learning algorithms, cluster
analysis, and advanced visualization.
BIG DATA MANAGEMENT
There is some good news regarding the management of big data: spam volumes are declining. They dropped 68% year-over-year between February 2011 and February 2012. In a 2011 year-end report, IBM estimated that volumes had returned to their 2008 levels. Other estimates suggest far more dramatic declines: from an all-time daily high of 225 billion messages down to 25 billion, a drop to roughly a tenth of what it was.
This is the 21st century; everything fun or beneficial runs off of ones and zeros. In fact, 90% of the data ever created has been created in the past two years (Conner Ch. 3). The innovation of "the cloud" has given attackers a bigger playing field, putting everybody else at much higher risk. America is being attacked more and more by other nations.
The biggest problem is that "the attacks are changing from intrusion attacks to disruptive attacks" (Kerner Ch. 13). Large industries need to be secure and prepared for any cyber attack. Last month, a malware-infected site appeared to be the root cause of some high-profile hacking: Facebook, Microsoft, Twitter, and Apple were all compromised in the attack. Now, The Security Ledger has learned that the one infected site was not the only cause of these attacks; at least two other mobile application development sites were also attacked, and an undisclosed number of other, non-dev-focused sites were part of the attack as well. The victims of the attacks also reportedly went well beyond the technology segment: auto manufacturers, US government agencies, and a high-profile candy maker were also hit.
Protecting big data archives presents an additional challenge based on the massive amount of data: a recurring backup is just not feasible. The massive amounts of data, including videos and photos, can go years without being used at all, and the occasional query would only touch small components of the massive data source. The challenge is in the data size and the complex access pattern; as a result, a disk-only storage solution may not be practical. Object-based storage using cloud infrastructure could be the solution, as these vendors have eliminated the key scaling issue of traditional file systems. The solution includes establishing a near-infinite disk-based storage system that gets backed up to another disk-based, near-infinite-capacity system. The new issue becomes the cost, which could be addressed with a tape backup.
SECURING BIG DATA
Securing data is one of the most important criteria for managing a big data environment, but it can also be difficult as a result of the magnitude and complexity of the data. The first step toward securing big data is to understand the storage patterns and how the data will be used. The concept of big data is not going away any time soon; if anything, it will get bigger. Big data is important to both the business and the technology side of any corporation, resulting in a perfect marriage. As recommended on Wired Innovation Insights, there are five steps you can take to secure your big data.
First, assess your data storage environment and understand usage patterns; this will help you know where information is most vulnerable. Make sure that the data cluster is secured against attacks. The next step is to keep controls close to the data source, which creates a more effective line of defense. The third step is to embed security in the data center cluster in the form of access control, such as RBAC; this way, if the perimeter is breached, the cluster and sensitive data are still protected by a security wrapper. Fourth, encrypt both static and in-transit data, creating a continuous state of motion as it travels from node to node. Lastly, incorporate a logging function into the cluster, as mandated by most regulatory compliance regimes for auditing purposes. A sketch of the encryption step follows.
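
Here is a minimal sketch of encrypting a record before it is stored or moved between nodes, using the Fernet recipe from the Python cryptography package; the record contents and the in-memory key handling are illustrative assumptions, since a real deployment would manage keys in a dedicated key store.

from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, load this from a key store
cipher = Fernet(key)

record = b"customer-id=4821, card=XXXX-XXXX-XXXX-1234"
token = cipher.encrypt(record)     # ciphertext safe to store or send node-to-node
print(cipher.decrypt(token) == record)  # True: only key holders can read it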
REFERENCES
Crump, George. How to Protect the Big Data Archive. InformationWeek. Retrieved Apr 2013. www.informationweek.com/storage/disaster-recovery/how-to-protect-the-big-data-archive/232901095
Big Data - What Is It? SAS. Retrieved Apr 2013. http://www.sas.com/big-data/
Krikken, Ramon. "Securing Big Data" - The Newest Fad. Gartner. http://blogs.gartner.com/ramon-krikken/2012/05/10/securing-big-data-the-newest-fad/
Kossman, Rachel. Understanding Primary Data Storage in the Cloud. SearchCloudStorage. http://searchcloudstorage.techtarget.com/podcast/Understanding-primary-data-storage-in-the-cloud
Vogt, Jim. Securing Big Data's Future. Wired. http://searchcloudstorage.techtarget.com/podcast/Understanding-primary-data-storage-in-the-cloud
Hurwitz, J., Nugent, A., Halper, F., and Kaufman, M. 2013. Big Data for Dummies. Hoboken, New Jersey: John Wiley & Sons, Inc.


TERRAIN CLASSIFICATION
JAZZMINE BESS
Florida A&M University
Tallahassee, FL
INTRODUCTION
"The science of today is the technology of tomorrow." --Edward Teller. Science and technology
are interdependent. Both are needed for the betterment of humanity. Exploration and innovation
would not exist with the codependency of technology and science. This year I had the
opportunity to experience firsthand the bond between the two. NASA sponsors, NASA USLI, a
rocket competition every year for universities and high schools nationwide. The event is a school
yearlong competition. This year is my second year on the team and I serve as Project Lead. The
ultimate goal of the program is to build a rocket that will fly a mile high and contains a scientific
payload. A scientific payload is any experiment that can be conducted in the rocket that will be
recoverable. I am a third year computer information systems student and I lead a team of
computer science and engineering students. My team and I developed a payload that
incorporates technology and science. The research started in August and the competition will
take place April 17-22. The experiment I’m conducting is a terrain classification experiment. The
experiment streams a video from the rocket and at set altitudes takes still frames from the video
and then sends the pictures to the computer to be analyzed. The computer will be able to
determine the terrain of the still shots. Once the computer has analyzed the pictures the computer
will be interacting with a mobile device on the ground that will receive pictures from the rocket
and the classifications for each picture.
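
The paper does not specify the classification algorithm, so the Python sketch below stands in with a simple dominant-color rule over a still frame's pixels; the thresholds and terrain labels are assumptions made for illustration.

def classify_terrain(pixels):
    """pixels: list of (r, g, b) tuples from one still frame."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n  # average red
    g = sum(p[1] for p in pixels) / n  # average green
    b = sum(p[2] for p in pixels) / n  # average blue
    if b > r and b > g:
        return "water"
    if g > r and g > b:
        return "vegetation"
    return "soil/urban"

frame = [(120, 90, 60)] * 100 + [(140, 100, 70)] * 50  # brownish test frame
print(classify_terrain(frame))  # -> "soil/urban"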
SCIENCE VALUE AND UNIQUENESS
This experiment was spurred by the recent success of the Curiosity mission. Curiosity is currently testing and identifying the surfaces of Mars. Terrain exploration is important to understanding other planets. While robots, such as the rover Curiosity, are normally used in terrain classification experiments, my experiment uses pictures. My project allows for experimentation without the additional cost of robotics. Also, the project includes an app which interacts with a computer a mile high. Integrating mobile technology with rocketry is a fairly new concept. The app will be designed to read data in real time. This experiment raises the question of how else mobile technology can influence rocketry.
HARDWARE COMPONENTS
The camera that I chose was the HackHD camera. This camera was chosen for the quality of its pictures: it records 1080p HD video at a frame rate of 30 frames per second and allows for composite video output. The camera also comes with a 2 gigabyte microSD card. The


ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYKayeClaireEstoconing
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxlancelewisportillo
 

Último (20)

Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-design
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
Concurrency Control in Database Management system
Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
 
Activity 2-unit 2-update 2024. English translation
Activity 2-unit 2-update 2024. English translationActivity 2-unit 2-update 2024. English translation
Activity 2-unit 2-update 2024. English translation
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptxAUDIENCE THEORY -CULTIVATION THEORY -  GERBNER.pptx
AUDIENCE THEORY -CULTIVATION THEORY - GERBNER.pptx
 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
 
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptxQ4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
Q4-PPT-Music9_Lesson-1-Romantic-Opera.pptx
 

exciting three days! These proceedings contain papers from 11 students, IT Showcase history, a list of past winners, and photos of the winners since 2008. The presentations and projects were judged earlier this morning in two categories: college (undergraduate) and high school. Three awards will be made in each category for 1st, 2nd, and 3rd place. Certificates for all participants will be presented at the end of the IT Showcase session on Thursday afternoon. Awards for the first, second, and third place winners will be presented at the Awards Banquet on Saturday.

During the first decade of the IT Showcase, 2003-2012, 42 undergraduate papers and 66 high school papers were presented. The students represented 14 states, 24 high schools, 12 universities, and two community colleges. Their papers covered database technology, web design technology, wired and wireless communication technology, IT security, data mining, soft computing, high performance computing, cloud computing, virtual technologies, nanotechnology, robotics, operating systems, IT certification, social media, Big Data, health information systems, and more.

Again, thanks for joining us at the BDPA IT Showcase!

Jesse L. Bemley, Ph.D.
BDPA IT Showcase Conference Manager
THE STORY OF THE IT SHOWCASE

The idea that led to the creation of the IT Showcase has taken several twists and turns over the years. As far back as the late 1980s, a UNISYS communications engineer in Philadelphia talked about a BDPA computer science fair, patterned after the traditional high school science fair. The idea was put on the back burner because of the all-consuming activities of the pilot HSCC, which was held in Atlanta in 1986.

During the New Orleans convention in 1987, the Artificial Intelligence Model of a Black Teenager was presented by three teenage girls. The model was an expert system developed in the AI language Prolog. The student presentation was a component of Dr. Jesse Bemley's workshop. Bemley's national conference workshops included high school students from 1989 to 1992. The students' names were not part of the conference program; instead, the workshops had separate programs as handouts.

In 1993, Margaret Jennings suggested that Bemley's students participate in the Youth Conference as the High School Computer Science Fair at the BDPA National Conference in Kansas City, MO. For the very first time, students' names were published in the official conference Youth Activities Guide. Five high school students presented papers. Their research areas included expert systems, logic puzzles, neural networks, and fractals. The activity continued until the 1997 conference in Houston, TX. There were no further computer science fairs; the national conference co-coordinator did not want students making presentations to Youth Conference participants, only adult presenters.

In 2001, Dwight Huggett, then HSCC coordinator, proposed an IT Showcase where the projects of college and high school students would be highlighted. The effort did not get off the ground. There was a subsequent attempt in 2002; again, the resources were not available. In 2003, BDPA President-Elect Wayne Hicks asked Bemley to accept the challenge, which he did. Hicks wanted an event that would keep HSCC alumni moving academically toward the Ph.D. Bemley modified the requirements for participation to include: a 10-page paper on a leading-edge topic, a 15-minute PowerPoint presentation, a 3 ft by 4 ft poster that graphically depicts the paper, a one-page bio with photo, and a trip report due a week after the conference. The 2003 BDPA National Conference in Philadelphia hosted the first IT Showcase.

Washington, DC is hosting the 2013 BDPA National Technology Conference and the 11th BDPA IT Showcase. The Cincinnati Chapter hosted the first regional IT Showcase in 2005 and another in 2011. There have been several unsuccessful attempts at regional/local IT Showcases in subsequent years. The BDPA-DC Regional Technology Conference in Washington, DC held very successful IT Showcases in 2007, 2008, 2009, 2010, 2011, and 2012. The Northern Delaware Regional IT Showcase was held on May 15, 2010; it can serve as a model for other regional IT Showcases that don't wish to use the traditional IT Showcase paradigm. The Greater Columbia Chapter hosted the Washington, DC Chapter in 2012 at a very successful Southeast Regional IT Showcase.
2012 IT SHOWCASE WINNERS

High School
First Place: TYRA NATORI FOULKS, Irmo High School, Columbia, SC
Second Place: WESLEY WALKER, Groveport Madison High School and Fairfield Career Center, Groveport, OH
Third Place: BRANDI TAYLOR, Irmo High School, Columbia, SC

Undergraduate
First Place: MICHAEL BIJOU, Bowie State University
Second Place: BRYAN BEMLEY, Bowie State University
Third Place: ADWAIT WALIMBE, University of Minnesota
MEET THE IT SHOWCASE JUDGES

Chief Judge
CURTIS ROBERTS
NOAA National Satellite Data and Information Service

Mr. Curtis Roberts is the Enterprise Architect for the National Satellite Data and Information Service, an office of the National Oceanic and Atmospheric Administration whose mission is the acquisition and dissemination of environmental data from satellites and the development of data products used worldwide. Mr. Roberts came to NOAA from the Department of Housing and Urban Development (HUD) in Washington, District of Columbia, where he served as a Senior Information Technology Specialist and Project Manager with the Office of Systems Integration and Efficiency; his responsibility was to support the mission to increase homeownership, support community development, and increase access to affordable housing. Mr. Roberts, a 15-year US Navy veteran, entered the federal civilian service with HUD in 1989. Previously a Department of Defense civilian contractor, Mr. Roberts performed information technology technical services supporting the Office of the Secretary of Defense, the Vice Chief of the Army, and aviation engineering logistical support for the US Navy and Naval Air Systems Command. Understanding the need for continued education in Science, Technology, Engineering and Math, Mr. Roberts holds an advisory position with the Hampton Roads Chapter of BDPA and an advisory relationship with non-profit, community-based educational organizations that teach STEM topics to students in grades 4 through 12.
Judge
FABIANNA SOLARI
Middleware Services Advanced Support Teams, Monsanto

Fabianna has over 14 years' experience in the IT industry. She began her IT career in IBM Global Services, where she focused on developing and supporting e-commerce, Java, AS/400, and Microsoft applications. In 2000, she began her career at Monsanto as a contractor on the Web Engineering Team and transitioned to a Monsanto employee in 2001. During her career at Monsanto, she has held various roles including Project Manager, Web Server Security and Implementation Engineer, Windows Security Administrator, and Web, Database, and Portals Advanced Support Lead. She currently leads the Middleware Services Advanced Support Teams, which support emerging platforms such as Business Objects, Informatica, BPM (Lombardi), and GeoSpatial.

In addition to receiving her B.S. and M.S. in Computer Science from Clark Atlanta University with a concentration in Data Warehousing, Fabianna has maintained a focus on leadership, team development, and training for youth, college students, and professionals. Over the past 8 years, she has focused on her passion for helping others leverage their talents and strengths to achieve their goals. This has been accomplished via coaching and mentoring as well as facilitating sessions such as StrengthsFinder, DiSC, SOAR, iSpeak, Asset Based Thinking, Transformational Conversations, and Effective Presentation and Communication Skills.

At Monsanto, Fabianna is a member of the African Americans in Monsanto (AAIM) Leadership and Retention Committee, the GI Leadership Development Committee, the Monsanto BDPA Committee, the Women's Network, Women in IT, Young Professionals, and the Women's Leadership Giving Initiative. She also serves as one of two Monsanto liaisons for the St. Louis Chapter of the National Black MBA Association. Fabianna enjoys spending time with her nieces, nephews, godchildren, and youth ministry teens. She also enjoys traveling to a new destination each year, cruising, vacationing in the Caribbean, and trying new recipes.
Judge
DR. JAMES M. TURNER
NOAA Director of the Office of International Affairs

Dr. James M. Turner leads NOAA's international scientific and environmental efforts associated with the global oceans, atmosphere, and space. He serves as the principal advisor to the Under Secretary and Administrator on international policy issues; represents NOAA and the United States with foreign governments and in international fora; establishes policies, guidelines, and procedures for NOAA's international programs; and provides support and coordination to NOAA's line offices. These efforts help us to better understand, predict, and respond to changes in the Earth's environment, conserve and manage coastal and marine resources, protect life and property, and provide decision makers with reliable scientific information.

Dr. Turner came to NOAA from the U.S. Department of Commerce's National Institute of Standards and Technology (NIST), where he served as the Acting Director (September 2007 to September 2008) and Deputy Director (April to September 2007). NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology.

Prior to joining NIST, Dr. Turner served as the Assistant Deputy Administrator for Nuclear Risk Reduction in the Department of Energy's National Nuclear Security Administration. In that position he was responsible for major projects in Russia to permanently shut down their last three weapons-grade plutonium-production reactors. He also worked with foreign governments and international agencies to reduce the consequences of nuclear accidents by strengthening their capability to respond to nuclear emergencies. Turner has also held several senior management posts at DOE concerned with laboratory oversight, nuclear safety, and the safeguarding of nuclear weapons both here and abroad. He holds degrees in Physics from the Massachusetts Institute of Technology (Ph.D.) and Johns Hopkins University (B.A.), and taught for five years as an Associate Professor of Physics and Engineering at Morehouse College.
IT SHOWCASE PRESENTATION SCHEDULE AT A GLANCE

WEDNESDAY, AUGUST 14

Practice Session and Poster Setup: after the Opening Session

THURSDAY, AUGUST 15

Poster Presentation Judging (closed to the public): 8:30 a.m. - 10:00 a.m.
Welcome and Introductions (Dr. Jesse Bemley, IT Showcase Manager): 10:00 a.m. - 10:30 a.m.
Keynote Speaker Address (Shaneece Davis, IT Specialist, US Department of Health and Human Services) and Morning Presentations: 10:30 a.m. - 12:00 p.m.
Lunch: 12:00 p.m. - 1:25 p.m.
Afternoon Presentations: 1:30 p.m. - 3:00 p.m.
Award Presentation: 3:30 p.m. - 4:30 p.m.
Closing Remarks: Monique Berry, National BDPA President
KEYNOTE SPEAKER

SHANEECE DAVIS
IT Specialist, Centers for Medicare & Medicaid Services (HHS)

Shaneece Davis is an IT Specialist at the Centers for Medicare and Medicaid Services with a background in project management and a passion for information technology. Shaneece recently earned her Master of Science degree in Information Management from the University of Maryland at College Park and also holds a Bachelor of Science degree in Computer Information Systems from North Carolina Central University.

Shaneece participated in the BDPA Annual IT Showcase in 2010 and 2011, receiving 2nd place in the competition for her research titled "Increasing Active Learning among Students: NCCU's Introduction of Virtual Computing Lab to Grades K-12."

Shaneece has a special interest in pursuing research endeavors related to health care, information science, and IT. She has been a student member of BDPA since 2010 and has most recently joined the Golden Key Honour Society and the Special Libraries Association (SLA) as of this year. In her current position as an IT Specialist, Shaneece assists in the oversight of various government contracts, with duties related to project management. As a recent graduate who is fairly new to the workforce, Shaneece hopes to make a huge impact on the IT field someday.

Connect with Shaneece
Email: shaneecedavis@gmail.com
Facebook: facebook.com/ShaneeceSDavis
LinkedIn: linkedin.com/pub/shaneece-davis/33/953/1b1
HIGH SCHOOL PRESENTATIONS AT A GLANCE

CYBER SECURITY AND THE TYPE OF HACKERS INVOLVED
Zinquarn Wright

NETWORK SECURITY ENGINEER: WHAT IS IT?
Anthony Lawson

CYBER SECURITY FOR PUBLIC WORKS AND UTILITIES
Cristal Sandoval

LAW AND FORENSIC SCIENCE: THE TECHNOLOGICAL ASPECTS
Abdull Ali

"PENETRATING THE CLOUD": EXPLORATION OF CLOUD COMPUTING
Brandi Taylor

MOTION SENSOR TECHNOLOGY IN VIDEO GAMING
Julian Anderson

FUZZY LOGIC
Tevon Eversley

CYCLOTRONS: FROM THE LAB TO THE FIELD
Ian Crowder

THE FUTURE OF AIR TRAVEL
Jared Sherrod
MEET THE HIGH SCHOOL STUDENT PARTICIPANTS

ZINQUARN WRIGHT
McKinley Technology High School, Washington, DC
Zinquarn Wright is a junior at McKinley Technology High School. He will major in music production in college. His presentation is "Cyber Security and the Type of Hackers Involved."

ANTHONY LAWSON
Academy for Ideal Education, Washington, DC
Anthony Lawson is a junior at the Academy for Ideal Education. He expects to major in architecture upon matriculation at a university. His presentation is "Network Security Engineer: What is it?"

CRISTAL SANDOVAL
Thurgood Marshall Public Charter School, Washington, DC
Cristal Sandoval is a junior at Thurgood Marshall Public Charter School. She has a keen interest in science and math. Her presentation is "Cyber Security for Public Works and Utilities."

ABDULL ALI
Frank Ballou High School, Washington, DC
Abdull Ali is a junior at Ballou High School. His career aspirations include the veterinary and IT fields. His presentation is "Law and Forensic Science: The Technological Aspects."

BRANDI TAYLOR
Irmo High School, Columbia, SC
Brandi Taylor is currently a 10th grade student at Irmo High School. Her current plan after high school is to become a nurse. Her motto is "I can do anything if I put my mind to it." Her presentation is "'Penetrating the Cloud': Exploration of Cloud Computing."

JULIAN ANDERSON
Oak Park River Forest High School, Oak Park, IL
Julian Anderson was born in Oak Park, Illinois on September 30th, 1994. At the age of 2, he became the youngest child to go to the Supreme Court. He is fascinated by how technology impacts everyone's lives every day. He will be attending the University of Nebraska in the fall. He doesn't have a major yet; however, he hopes to start a career where technology is heavily involved. His presentation is "Motion Sensor Technology in Video Gaming."

TEVON EVERSLEY
Bedford Academy High School, Brooklyn, NY
Tevon Eversley is on track to receive an Advanced Regents Diploma from Bedford Academy High School. He would like to attend the Massachusetts Institute of Technology to study mechanical engineering as well as architecture. His presentation is "Fuzzy Logic."

IAN CROWDER
Oak Park River Forest High School, Oak Park, IL
Ian has pursued a keen interest in physics by taking part in several scientific symposiums at which he presented his research on particle, applied, and atomic physics. Ian recently graduated from Oak Park River Forest High School, and this fall he will be attending Iowa State University. His presentation is "Cyclotrons: From the Lab to the Field."

JARED SHERROD
Delaware Academy of Public Safety and Security, Newark, DE
Jared D. Sherrod is a fourteen-year-old freshman at the Delaware Academy of Public Safety and Security, where he maintains a 3.0 grade point average and is working toward his goal of graduating as an Air Force Academy cadet. His presentation is "The Future of Air Travel."
MOTION SENSOR TECHNOLOGY IN VIDEO GAMING
The Wave of the Future

JULIAN ANDERSON
Oak Park River Forest High School, Oak Park, IL

Motion sensor technology has been revolutionizing the way we live for years. It is not only changing the video game industry but will be a key component of technology in the future. Motion sensors were long used mainly in automotive and specialized industrial applications. New applications and broader market opportunities have emerged since 2006, when the devices were first utilized in the Nintendo Wii console. The market for motion sensors has now grown to include a much broader set of applications, including sporting equipment, home appliances, industrial solutions, security monitoring systems, and medical rehabilitation and clinical devices.

Motion sensing technology is drastically changing the video gaming industry. The first motion sensor gaming device was the Power Glove, originally released in 1989 by Nintendo. The flex sensors in the Power Glove were carbon-based ink on plastic; bending the flex sensors caused the carbon to compress, decreasing the resistance. (The sensors in the earlier DataGlove were instead based on optical fibers, scratched near the bending joint so that they transmitted less light when bent, an innovation developed by Young L. Harvill of VPL Research.) There were two ultrasonic speakers (transmitters) in the glove and three ultrasonic microphones (receivers) around the TV monitor. The ultrasonic speakers take turns transmitting a short burst (a few pulses) of 40 kHz sound, and the system measures the time it takes for the sound to reach the microphones. A triangulation calculation is performed to determine the X, Y, Z location of each of the two speakers, which specifies the yaw and roll of the hand. The only dimension it cannot calculate is the pitch of the hand, since the hand can pitch without moving the location of the two ultrasonic speakers.

The Power Glove was based on the patented technology of the VPL DataGlove, but with many modifications that allowed it to be used with slow hardware and sold at an affordable price. The DataGlove can detect yaw, pitch, and roll, and uses fiber optic sensors to detect finger flexure with a resolution of 256 positions (8 bits) per finger for four fingers (the little finger is not measured to save money, for it usually follows the movement of the ring finger). The Power Glove could only detect roll, and uses sensors coated with conductive ink yielding a resolution of four positions (2 bits) per finger for four fingers. This allowed the Power Glove to store all the finger flexure information in a single byte.
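To make that encoding concrete, here is a minimal Python sketch of how four 2-bit flex readings could be packed into, and unpacked from, a single byte. This illustrates the idea only; it is not the Power Glove's actual firmware, and the function names and bit ordering are assumptions.

def pack_fingers(thumb, index, middle, ring):
    """Pack four 2-bit flex values (each 0-3) into one byte.

    Bit layout (an assumption for illustration):
    bits 7-6 = thumb, 5-4 = index, 3-2 = middle, 1-0 = ring.
    """
    for value in (thumb, index, middle, ring):
        if not 0 <= value <= 3:
            raise ValueError("each flex value must fit in 2 bits (0-3)")
    return (thumb << 6) | (index << 4) | (middle << 2) | ring

def unpack_fingers(byte):
    """Recover the four 2-bit flex values from a packed byte."""
    return ((byte >> 6) & 0b11, (byte >> 4) & 0b11,
            (byte >> 2) & 0b11, byte & 0b11)

# A fist with a straight index finger: one byte describes the whole hand.
packed = pack_fingers(3, 0, 3, 3)
assert unpack_fingers(packed) == (3, 0, 3, 3)
print(f"packed hand state: {packed:#010b}")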
However, it appears that the fingers actually feed an analog signal to the microprocessor on the Power Glove, which converts the analog signal into two bits per finger.

The technology has been greatly advanced by Nintendo, Sony, and Microsoft since the Power Glove. The Wii Remote's built-in optical sensor acts like a camera used to locate the sensor bar's infrared LEDs in its field of view. By plotting where two spots of light fall (one from each end of the sensor bar), the Wii console is able to determine where the remote is pointing in relation to the screen. It all sounds very clever, and it is, but it does have one obvious drawback: a user, or indeed multiple users, would need to remain within the sensor bar's field of view for the system to work.

In addition to knowing where a user is pointing, the Wii Remote can also calculate how it's being moved. This is done with accelerometers: tiny chips that feature a piece of silicon anchored at one end and allowed to move at the other, between an electric field created by a pair of capacitors. Accelerating the remote in one direction causes the silicon to move and disrupt the electric field. That change is translated into motion, and the Wii Remote can measure acceleration on three axes, allowing a variety of gestures such as moving side to side, twisting, and pushing and pulling. On top of all this, the data captured by the optical sensor and accelerometers needs to be sent back to the Wii console without wires. To achieve that, the Wii Remote contains a built-in Bluetooth chip that allows two-way communication with the console.

(Image: Xbox 360 Kinect)

This is one example of the strides that have been made in the development of motion sensing technology, which is being advanced to include 3D programming and a variety of sensor types to improve gaming. This technology will not only improve gaming but will also have an impact on the way society moves forward in the future.
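As a rough illustration of what the three-axis accelerometer described above makes possible, the sketch below estimates roll and pitch from a static gravity reading. This is a generic textbook calculation, not Nintendo's algorithm; the axis conventions and names are assumptions.

import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate roll and pitch (degrees) from a static 3-axis
    accelerometer reading, using gravity as the reference vector.

    Assumes the controller is roughly still, so the only measured
    acceleration is gravity (values in g units).
    """
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Controller lying flat: gravity falls entirely on the z-axis.
print(tilt_from_accelerometer(0.0, 0.0, 1.0))    # -> (0.0, 0.0)
# Controller tilted: part of gravity shows up on the y-axis.
print(tilt_from_accelerometer(0.0, 0.5, 0.866))  # roll is about 30 degrees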
One of the products, the SMotion unit, contains a set of LEDs and is worn on the player's belt. A regular web camera tracks the beams of those LEDs and processes the data using algorithmic software to determine the player's body position in real time. It is reportedly ten times more accurate and responds ten times faster than camera-only systems such as Kinect. Another device, which can be used instead of the SMotion, is the PMotion: a platform that the player stands on, which detects shifts in their center of gravity.

Motion sensors also continue to be used by the auto industry to assist with parking cars, and in security devices, digital cameras, and cell phones. Motion sensing technology is developing at a rapid pace, and this form of technology will soon be used in our everyday lives to perform tasks that we normally do ourselves.

REFERENCES

Lomberg, Jason. "The Future of Motion-Sensing Technology." Electronic Component News, 14 Jan. 2010. Web. 16 Apr. 2013.

Romero, Joshua J. "How Do Motion-Sensing Video Game Controllers Work?" Scienceline, Arthur L. Carter Journalism Institute at New York University, 18 Dec. 2006. Web. 15 Apr. 2013.

Shen, Jessie. "Motion Sensor Technology Moves towards Maturity with Related Markets to Hit US$63.8 Billion." DIGITIMES Inc., 14 Oct. 2010. Web. 15 Apr. 2013.

Talk, Niko. "How Motion Sensor Technology Is Revolutionizing Video Games." N.p., 01 Dec. 2010. Web. 8 Apr. 2013.

"Video Games' Battle of Motion Sensor Technology." Seeking Alpha, 3 June 2009. Web. 16 Apr. 2013.
FUZZY LOGIC

TEVON EVERSLEY
Bedford Academy High School, Brooklyn, NY

Early Pioneers

Fuzzy logic is an approach to computing based on "degrees of truth" rather than the usual "true or false" (1 or 0). Where conventional binary logic admits only the extreme cases of 0 and 1, complete falsehood or complete truth, fuzzy logic also includes the various states of truth in between, so that, for example, the result of a comparison between two things could be "somewhat tall" rather than simply "tall" or "short." The approach has been the subject of experiment since its introduction by Dr. Lotfi Zadeh of the University of California at Berkeley. Dr. Zadeh arrived at the idea because he believed it was possible to create an artificial intelligence that would compute solutions to problems, and to everyday conflicts, based upon previous instances.

Dr. Zadeh, the inventor of fuzzy logic, was an Iranian scientist born in 1921 in Baku, in the Azerbaijan Republic. His father, also Iranian, worked as a journalist; because of his father's position, the family lived in Baku. His mother was Russian and worked as a doctor.

(Image: Dr. Lotfi Zadeh)

In the years following Dr. Zadeh's initial work, research on fuzzy logic slowed for some time, until scholars such as the late Professor ValiAllah Tahani took an interest. After that, fuzzy logic once again seemed to have a promising future, as research was again conducted in the area.
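To ground the idea of degrees of truth, here is a minimal Python sketch of a fuzzy membership function for the predicate "tall." The thresholds (1.6 m and 1.9 m) and the linear ramp are arbitrary assumptions chosen for illustration.

def membership_tall(height_m):
    """Degree of truth, between 0.0 and 1.0, that a person is 'tall'.

    Below 1.6 m the claim is fully false (0.0); above 1.9 m it is
    fully true (1.0); in between, the truth value rises linearly.
    """
    if height_m <= 1.6:
        return 0.0
    if height_m >= 1.9:
        return 1.0
    return (height_m - 1.6) / (1.9 - 1.6)

for h in (1.5, 1.7, 1.8, 1.95):
    print(f"{h} m -> tall to degree {membership_tall(h):.2f}")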
In-Depth View of Fuzzy Logic

As mentioned previously, fuzzy logic is a computing approach based on degrees of truth. This may sound as simple as entering the instances and information, but a working system can take years to complete and is fairly complicated. First, the artificial intelligence system has to go through a learning process: a series of intense trials in which one teaches the system to differentiate the truth from what is not the truth. Once this process is complete, the system is put through several tests in which it must apply that information, to see whether or not it chooses the correct outcomes based upon the information previously entered. If it doesn't choose the right outcome, one must correct it and show it the way until it is able to choose the right outcome every time. Dr. Zadeh first proposed this idea during his work on fuzzy logic and natural language; natural language poses difficulty in being reduced to terms of absolute 0s and 1s.

Complications with Fuzzy Logic

Fuzzy logic may be a promising field with vast capabilities, but it poses its fair share of difficulties as well. One of these problems is that it isn't reliable; another is that it doesn't make decisions with emotional implications. The decisions that we as humans make are often based upon what we feel as well as what we know. Unlike us, an artificial intelligence makes decisions based solely upon what it knows. This puts a hindrance on the technology and places it at a disadvantage. For this problem, a unique solution has to be devised so that all concerns are addressed properly.

Solution to Complications

My proposition for solving this problem is to implement a Failure Mode and Effects Analysis (FMEA) component within fuzzy logic. In theory this would correct any potential problems that may occur. I plan to address the emotional implication problem by assigning the fuzzy logic technology only to situations that do not involve morals and/or emotions, so that the technology isn't at a disadvantage.

In-Depth View of Failure Mode and Effects Analysis

Failure Mode and Effects Analysis was one of the first systematic techniques for analyzing failures in technology. It was developed by reliability engineers in the 1950s to study problems that might arise from malfunctions of military systems. An FMEA is often the first step of a system reliability study. It involves reviewing as many components, assemblies, and subsystems as possible to identify failure modes and their causes and effects. For each component, the failure modes and their resulting effects on the rest of the system are recorded in a specific FMEA worksheet; there are numerous variations of such worksheets. An FMEA is an inductive failure analysis and is a core task in reliability engineering.
A successful FMEA activity helps to identify potential failure modes based on experience with similar products and processes, or based on common physics-of-failure logic. Effects analysis refers to studying the consequences of those failures on different system levels. Functional analyses are needed as an input to determine correct failure modes. Failure probability can be reduced by understanding the failure mechanism and reducing or eliminating the root causes and failure mechanisms that may lead to the failure mode. It is therefore important that the FMEA documents the causes of failure. To put it in layman's terms, Failure Mode and Effects Analysis is intended as a corrective process in which the artificial intelligence corrects its mistakes based upon previous instances and information.

Some of the benefits of this technique are:

- A documented method for selecting a design with a high probability of successful operation and safety.
- A documented, uniform method of assessing potential failure mechanisms and failure modes and their impact on system operation, resulting in a list of failure modes ranked according to the seriousness of their system impact and likelihood of occurrence.
- Early identification of single failure points (SFPs) and system interface problems, which may be critical to mission success and/or safety, along with a method of verifying that switching between redundant elements is not jeopardized by postulated single failures.
- An effective method for evaluating the effect of proposed changes to the design and/or operational procedures on mission success and safety.
- A basis for in-flight troubleshooting procedures and for locating performance-monitoring and fault-detection devices.
- Criteria for early planning of tests.

(Image: an example FMEA chart)
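As a sketch of what one row of such an FMEA worksheet captures, the Python snippet below scores failure modes with the widely used risk priority number (RPN = severity x occurrence x detection, each rated 1-10). The class layout, ratings, and example failure modes are invented for illustration, not taken from any real worksheet.

from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    failure_mode: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (easily detected) .. 10 (undetectable)

    @property
    def rpn(self):
        """Risk priority number: higher means address it sooner."""
        return self.severity * self.occurrence * self.detection

worksheet = [
    FailureMode("flex sensor", "ink wear gives a stuck reading", 6, 4, 7),
    FailureMode("fuzzy controller", "bad rule is never corrected", 8, 2, 5),
]

for row in sorted(worksheet, key=lambda r: r.rpn, reverse=True):
    print(f"{row.component}: {row.failure_mode} (RPN={row.rpn})")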
With the introduction of Failure Mode and Effects Analysis alongside fuzzy logic, the two should be able to coexist simultaneously. This effects-analysis addition would, in theory, log all possible failures within the fuzzy logic system, eliminating them over time and making the system more stable as well as more valuable. Fuzzy logic is a viable product, as it would help relieve some pressure on energy consumption by automatically regulating usage, such as the running time of light fixtures like lamps.

Applications in Society / Early Uses

Some of the earliest and most common uses of fuzzy logic technology are found in relatively simple applications such as household appliances. These applications only scratch the surface of the potential of fuzzy logic in complex mechatronic systems. Such household appliances include rice cookers as well as lighting systems. A rice cooker, for instance, uses fuzzy logic to make the process of cooking rice even easier: over the first several uses it teaches itself how long, on average, it takes to cook a certain type of rice. After this, in theory, the rice cooker shuts itself off and on based upon previous data, since it knows the run time from previous experience. Such systems are popular with the general public today, selling in major retail stores such as Macy's, with the cheapest usually running about $120. This example shows the capabilities of fuzzy logic technology, and how far short of its potential our everyday uses of it fall.

(Image: a fuzzy logic rice cooker)

More In-Depth View of the Rice Cooker

Fuzzy logic rice cookers have computer chips inside that direct their ability to make proper adjustments to cooking time and temperature. Unlike basic rice cookers, which complete tasks in a single-minded, mechanical manner, a fuzzy logic system follows a somewhat more complex process. While fuzzy logic rice cookers function under the same premise as basic models, their mathematical programming can deliver a slew of customized cooking options. The trick behind these capabilities is that the rice cooker can react, making precise fluctuations in cooking time and temperature depending upon the program selected. Programs may include keep-warm and quick-cooking cycles, plus settings for the optimum cooking of rice varieties like sushi rice, porridge rice, mixed rice, white rice, sweet rice, and brown rice. Some models also offer texture settings, allowing people to select hard or soft and sticky or wet rice. Based upon your selection, the cooker goes through its database of previous cooking times and temperatures, selects the best combination for the type of rice chosen, and produces an outcome accordingly.
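A toy sketch of the kind of rule-based adjustment just described: two fuzzy rules blend a heating level from a temperature reading. The membership curve, rule outputs, and setpoints are all invented for illustration, not taken from any real cooker.

def mu_too_cool(temp_c, target_c=100.0, band=10.0):
    """Degree (0..1) to which the pot is cooler than the target."""
    return min(1.0, max(0.0, (target_c - temp_c) / band))

def heating_level(temp_c):
    """Blend two fuzzy rules into one heating command (0..1).

    Rule 1: if too cool, heat strongly (output 1.0).
    Rule 2: if not too cool, hold a gentle simmer (output 0.2).
    """
    cool = mu_too_cool(temp_c)
    not_cool = 1.0 - cool
    # Weighted average of each rule's output, weighted by its truth.
    return (cool * 1.0 + not_cool * 0.2) / (cool + not_cool)

for t in (85, 95, 99, 101):
    print(f"{t} C -> heater at {heating_level(t):.2f}")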
Renewable Energy

Renewable energy has been an ever-growing field in recent years. In fact, it was a pressing concern during the presidential debate, as the United States of America is one of the top contributors to fossil fuel burning today. Another key point is that we are behind in the renewable energy field. The main problem is that many people just don't see it as a truly viable resource. They say, "why do today what we can do tomorrow," but the real question is why do tomorrow what you can do today. This mindset influences many to make foolish decisions and put off development and advancement in the renewable energy field. That stigma is changing: as seen in the chart below, projected investments in renewable energy have more than doubled.

(Chart: projected investments in renewable energy)
As seen in the above image, there is ever-growing awareness in the renewable energy field; more people are beginning to see its importance. As shown in the charts below, the price of gasoline has not only increased but is expected to continue increasing. This only emphasizes the need for advancements in the renewable energy field, as cost will become a pressing matter for future generations.

(Charts: historical and projected gasoline prices)

The main reason for the images above is to emphasize the point of renewable energy. Introducing the fuzzy logic system together with the Failure Mode and Effects Analysis technology would help cut down on the United States' consumption of energy.
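The scheduling idea developed in the next two paragraphs, dimming lighting as activity falls and cutting power at sunrise, can be sketched as a pair of fuzzy rules. Everything here (thresholds, membership shapes, function names) is an invented illustration, not a deployed system.

def mu_populated(people_per_hour, busy=200.0):
    """Degree (0..1) to which the area counts as 'populated'."""
    return min(1.0, people_per_hour / busy)

def mu_dark(sun_elevation_deg):
    """Degree (0..1) of darkness; fully dark once the sun is 6
    degrees below the horizon (roughly civil twilight)."""
    return min(1.0, max(0.0, -sun_elevation_deg / 6.0))

def lamp_brightness(people_per_hour, sun_elevation_deg):
    """Scale brightness by darkness; a floor of 0.3 keeps sparse
    areas dimly lit while it is dark, and daylight forces lamps off."""
    dark = mu_dark(sun_elevation_deg)
    populated = mu_populated(people_per_hour)
    return dark * max(0.3, populated)

print(lamp_brightness(250, -20))  # busy night -> 1.0 (full power)
print(lamp_brightness(30, -20))   # quiet night -> 0.3 (dimmed)
print(lamp_brightness(30, 10))    # after sunrise -> 0.0 (off)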
In theory, with the implementation of the fuzzy artificial intelligence, the system is supposed to factor in the run time and usage of office spaces and the like, and from there put together a schedule of exactly what times power should be directed to each room. During the times when a room is scheduled to be offline, all power to that sector would cut off automatically in order to conserve energy.

Real-Time Application Example

A real-time example of this would be street lamps, as these appliances consume a great deal of energy on a daily basis. First, the fuzzy logic system would evaluate the times at which the area is most populated. From there it would create a run-time schedule for the lamppost. It would also take into account the time the sun rises, as lampposts are not needed when natural light is at our disposal. Once this information is input and processed, the fuzzy logic technology would keep the lamppost on during normal run-time hours, but once it hits a time when the area isn't as populated, the lights would dim to conserve energy. Lastly, when the sun begins to rise, the artificial intelligence (A.I.) would cut off all power to that area/sector, as the lamppost would no longer be needed. This is a viable technology because it not only saves on energy consumption, which equates to saving money, but also saves on the cost of hiring employees: the artificial intelligence requires only a small group of employees to conduct maintenance rather than a whole fleet of city employees to manage lighting for a building. This is what makes fuzzy logic artificial intelligence a viable as well as valuable asset to society today.

REFERENCES

http://asq.org/learn-about-quality/process-analysis-tools/overview/fmea.html
http://www.clemson.edu/ces/credo/classes/fmealect.pdf
http://www.ncbi.nlm.nih.gov/pubmed/21302802
http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol2/jp6/article2.html
http://www.control-systems-principles.co.uk/whitepapers/fuzzy-logic-systems.pdf
http://faculty.ksu.edu.sa/73586/Documents/research_12.pdf
http://www.renewableenergyworld.com/rea/news/article/2007/04/seeking-careers-in-the-renewable-energy-field-47980
http://landartgenerator.org/LAGI-FieldGuideRenewableEnergy-ed1.pdf
http://www.nrel.gov/docs/fy01osti/28369.pdf
http://www.future-alternative-energy.net/renewable-energy-jobs.html
http://www.renewableenergyworld.com/rea/home
http://www.ucsusa.org/clean_energy/our-energy-choices/renewable-energy/public-benefits-of-renewable.html
http://www.temple.edu/lawschool/iilpp/environmentalarticles/wang%20mingyuan.pdf
http://www.energy.vt.edu/Publications/Incr_Use_Renew_Energy_VA_rev1.pdf
http://www.iea.org/aboutus/faqs/renewableenergy/
http://wi-consortium.org/wicweb/pdf/Zadeh.pdf
CYCLOTRONS: FROM THE LAB TO THE FIELD

IAN CROWDER
Oak Park River Forest High School, Oak Park, IL

Beyond its theories and research, particle physics has gained applications. By moving this science "from the lab to the field," it now has practical uses in the real world. Cyclotrons are a prime example of how a grandiose science can be focused onto more realistic uses.

In the past several decades, scientists' understanding of physics has advanced exponentially. From James Chadwick's discovery of the neutron in 1932 to CERN's finding of the Higgs boson in 2012, one of the key factors in this progress has been the development of particle accelerators. Particle accelerators are usually used to study matter and energy: using electromagnetic fields, accelerators fire particles at extraordinarily high velocity. All around the world, accelerators are renowned for their operations, but not all of them are the same. There are many different kinds of accelerator, and each has its own way of accelerating particles. A simpler design is the linear accelerator, like the one at Fermilab in the United States, which fires particles in a straight line. Another model is the synchrotron, like the one at CERN in Switzerland, which bends the trajectory of fired particles into a circular ring, using electromagnets to direct and steer them around the bend. However well known these models are, they both retain one major inconvenience: they take up too much space. The largest particle accelerators measure several miles across. There are, however, smaller, more convenient models. One of these is the cyclotron. The earliest cyclotrons were so compact they could easily slip inside your pocket, and that is why the cyclotron is such a revolutionary technology.

In the 1920s, physicist Ernest Lawrence thought that the format of a linear accelerator was impractical for lighter atomic particles, since it would need a vacuum tube spanning several meters to achieve adequate acceleration. This inconvenience inspired him to study how one could use the same energy potential multiple times instead of only once.

(Image: Ernest Lawrence)
In 1929, Lawrence got the idea while perusing a German electrical engineering journal: an article by Rolf Wideröe sketched a device that would allow someone to use the same electrical potential twice. This could be done by switching the potential from positive to negative so as to first push ions and then pull them, doubling the energy. Lawrence thought to use magnetism to bend charged particles into spiral trajectories, and therefore pass them through the same accelerating gap over and over again. So he used a magnetic field to curve charged particles through the same horizontal plane in a vacuum, repeatedly. This made his particle accelerator small and disc-shaped, far more convenient than the long chambers of the linear accelerators that were so common in his time.

(Image: 2-D schematic of a cyclotron)

A cyclotron is a circular accelerator except that, unlike CERN's equipment, it works in the shape of a spiral rather than a ring. It is an accelerator in which charged subatomic particles (e.g., protons) are injected at a gap at the center of the machine and accelerated outward along a spiral trail. The spiral lies in a plane perpendicular to a magnetic field, which is used to bend the path of the particles. A high-frequency square wave alternates the electric field across the gap so that, at every passing, the particle is accelerated rather than slowed.

The cyclotron is a unique particle accelerator with several different uses. It can serve as the source of charged particles for a collider in a lab: the cyclotron's propelled particles are shot into a separate machine, where they collide with one another within a closed space. When this happens, the accelerator becomes a particle collider, and scientists can measure the effects of the impacts. The consequences of these impacts help researchers infer the properties of many subatomic particles. The cyclotron can also be used in the field of medicine, as a proton source for cancer treatment. One of the greatest benefits cyclotrons give to the medical field is proton therapy: this process involves accelerating protons to penetrate the body and kill tumors through radiation damage, while minimizing damage to healthy tissue during the procedure. An application like this is a perfect display of how cyclotrons are useful even outside of the lab.
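The synchronization just described works because, for a non-relativistic particle, the orbital frequency f = qB / (2*pi*m) does not depend on the particle's speed. The short Python calculation below shows the radio frequency a proton cyclotron would need; the 1.5 T field strength and the sample speed are assumed example values, not figures from any particular machine.

import math

# Physical constants for a proton (SI units).
q = 1.602e-19   # charge, coulombs
m = 1.673e-27   # mass, kilograms

B = 1.5         # assumed magnetic field strength, teslas

# Cyclotron frequency: independent of speed, so one fixed RF
# frequency stays in step with the particle as it spirals outward.
f = q * B / (2 * math.pi * m)
print(f"gap voltage must alternate at about {f / 1e6:.1f} MHz")

# The orbit radius grows with speed: r = m*v / (q*B).
v = 3.0e7       # example speed, m/s (about 10% of light speed)
r = m * v / (q * B)
print(f"at v = {v:.1e} m/s the orbit radius is {r * 100:.1f} cm")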
The cyclotron has been around for almost eighty-three years now and is still used today. Its invention was revolutionary. This machine started as a tool for researching particles too small for the eye to see, seemingly almost irrelevant to the macro-world we live in, but over the years it evolved to have a more direct impact on society through its contribution to medicine. The convenience of such a compact yet powerful technology makes it a great example of how the very specific topic of particle physics can be applied to everyday life.

REFERENCES

"A Science Odyssey: People and Discoveries: Lawrence Invents the Cyclotron." PBS: Public Broadcasting Service. Web. 15 Apr. 2013. <http://www.pbs.org/wgbh/aso/databank/entries/dp31cy.html>.

"An Early History of LBNL by Dr. Glenn T. Seaborg." Lawrence Berkeley National Laboratory. Web. 15 Apr. 2013. <http://www.lbl.gov/LBL-PID/Nobelists/Seaborg/65thanniv/12.html>.

"Glossary Item - Cyclotron." Science Education at Jefferson Lab. Web. 15 Apr. 2013. <http://education.jlab.org/glossary/cyclotron.html>.

"The First Cyclotrons - Ernest Lawrence and the Cyclotron: AIP History Center Web Exhibit." The American Institute of Physics. Web. 15 Apr. 2013. <http://www.aip.org/history/lawrence/first.htm>.

Yarris, Lynn. "Ernest Lawrence's Cyclotron." Lawrence Berkeley National Laboratory. Web. 15 Apr. 2013. <http://www.lbl.gov/Science-Articles/Archive/earlyyears.html>.
THE FUTURE OF AIR TRAVEL

JARED SHERROD
Delaware Academy of Public Safety & Security High School, Newark, DE

ORIGIN OF COMMERCIAL TRAVEL

Today, commercial flight is commonplace and expected, and there is a large body of science behind the design, development, and operation of large commercial airplanes. However, when man began to take flight, the majority of planes were built for one or two passengers at most. So when did commercial flight become commonplace? Commercial air travel had a humble beginning with a short airboat flight from St. Petersburg to Tampa, Florida. This St. Petersburg-Tampa airboat flight lasted approximately 23 minutes in total. The flight carried only two passengers, but those passengers paid for a trip on a publicly scheduled commercial flight. The Benoist XIV boat plane, used for the first commercial flight, was considered at the time to be much safer than planes built to land on runways. After 1945, airline travel became a popular means of transportation, starting the revolution toward where the business is today. The progression of the commercial airline industry has driven aeronautical engineering technology to its limits and sometimes well beyond. I believe that now and in the future air travel will remain the best way to travel long distances for business and pleasure, and I know there is room for improvement of airline systems and aircraft to ensure the continued progress of this transportation sector.

MILITARY INFLUENCE ON THE AVIATION BUSINESS

After WWI, the United States was swamped with aviators, and the military was a big influence on the aviation industry. In 1918, the United States Postal Service began experimenting with air mail using a training plane from the Army. Private operators were the first to fly the mail: the United States Postal Service was the first to use private aviators to fly commercial routes. Eventually, due to numerous accidents, the United States Army was tasked with mail delivery for the US Postal Service.
In 1925 the Ford Motor Company bought the Stout Aircraft Company. Ford then successfully made the first airliner, named the Trimotor. With a twelve-passenger capacity, the Trimotor was a turning point for commercial aviation, because previous planes did not provide enough capacity to support a profitable airline business. After WWII, aircraft manufacturers such as Boeing and Lockheed Martin started making bigger aircraft inspired by WWII bombers. They stripped out the military gear and replaced it with seats, storage, and passenger comforts. This successful tactic led to an air travel boom. In the 1950s the airline industry began to develop planes designed for air travel and introduced jet and turboprop propulsion. This was the start of the jet age, with international flights becoming very popular.

In an earlier era, Juan Trippe had begun to build his own airline in America. He achieved his goal with the creation of Pan American World Airways; Juan's fleet was made up of flying boats, with routes from Los Angeles to Shanghai and Boston to London. Pan American and Northwest Airlines were the only airlines to operate international routes in the 1930s. This was due to extraordinary aircraft, the Boeing 247 and the Douglas DC-3, that made these companies very profitable even through the Great Depression.

By the 1970s the airline industry and its planes had matured beyond anything conceivable back when the first commercial flight took place in 1914. These aircraft used turbine jet engines and new jumbo wide bodies. This made it possible to seat four hundred plus travelers on a 747 and fly them at 500 miles per hour to their destination. These planes also revolutionized commercial shipping, with the capacity to carry almost anything by air. The TU-144 and the Concorde made supersonic flight between America and Europe a reality; these planes bridged the gap between the two continents in half the time of conventional air travel.

Airplanes these days are technologically advanced, with many features that would overwhelm anyone researching them, but there is always room for improvements to make them better for the customer and the bottom line. There are three main areas that can typically be improved upon: structure, aerodynamics, and propulsion.
The military has had a major influence on cargo planes, finding new materials and different types of structure to make the plane more lightweight, maneuverable, and fuel efficient. The outcome of making these planes lightweight with more space is increased profit, along with lower fuel costs and better gliding ability if a dire emergency were to occur. Along with modifications to increase flight demand, airlines also want travelers to want to travel and to feel comfortable on the flight; first class offers very little for the technology we have now. First class should become the new business class, and a new first class should be an all-around entertainment experience with bars, pool tables, and a mini golf green, all in the plane.

Flight propulsion is very important, above all where fuel efficiency is concerned. Propulsion is the process of forcing an object to move. The average airline company spends about 47,288.2 million dollars on plane fuel a year; this is where most of its income goes. There is a fuel that is made from algae. It is a good fuel: it is good for the environment and can be used in the most popular aircraft you see now. Algae fuel is a good alternative because of its price; even though not much of this fuel is produced yet, it is a recyclable oil and it is very easy to make. Also, kerosene is in short supply, so this fuel can go very far in the long run. Boeing has already started to use this product in its test aircraft.

The main body of the airplane is called the fuselage. The wings come out from the fuselage at the center of gravity and hold the airplane's weight during flight. Attached to the trailing edges of the wings are the ailerons, which are control surfaces used to control the plane on the roll axis, and the flaps, which enable the plane to descend more easily on final approach. The wing may attach to the fuselage at the top (high wing), at the bottom (low wing), or even cut through the fuselage (mid wing). Some early airplanes had two stacked sets of wings (biplanes) and, on rare occasions, three (triplanes); biplanes usually had control surfaces on only one of their sets of wings. In some designs the wing may not attach directly to the fuselage at all, instead attaching to the aircraft via stiff structural pieces. The tail usually consists of three smaller fins, two horizontal and one vertical, called the horizontal and vertical stabilizers. Attached to the trailing edges of the horizontal stabilizers are the elevators; these control surfaces are used to control the airplane on the pitch axis. Attached to the trailing edge of the vertical stabilizer is the rudder, used to control the airplane on the yaw axis.
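As an illustrative summary of which surface commands which axis, here is a short sketch I am adding (it is not from the original paper; the sign conventions and scaling are invented) that maps pilot inputs to control-surface deflections:

```python
# Map pilot inputs (each in the range -1..1) to control-surface deflections.
# Sign conventions and scaling here are invented for the example.
def mix_controls(stick_roll, stick_pitch, pedal_yaw):
    return {
        "left_aileron": -stick_roll,   # ailerons deflect opposite one another (roll)
        "right_aileron": stick_roll,
        "elevators": stick_pitch,      # elevators move together (pitch)
        "rudder": pedal_yaw,           # rudder deflection (yaw)
    }

print(mix_controls(stick_roll=0.5, stick_pitch=-0.2, pedal_yaw=0.1))
```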
Tails also come in many different configurations. T-tails put the horizontal stabilizers at the very top of the vertical stabilizer; H-tails have two smaller vertical stabilizers; and V-tails instead have just two stabilizers, with corresponding ruddervator control surfaces. A ruddervator belongs to a butterfly, or vee, tail, which combines the effect of the rudder and the elevators. The result is movable surfaces that cause the aircraft to pitch up or down when they move together, and to deviate from a straight course when they move differentially.

The engines may be mounted in many different places depending on the type of aircraft. Most civilian single-engine airplanes have the engine and propeller mounted on the front of the aircraft; most light twins have them mounted on the wings; jets may have engines mounted on spars below the wings or on the back section of the fuselage. Aircraft have been made in many strange configurations: sometimes the engine is in the back in a pusher-prop configuration, mounted on a spar well above the airplane, or, in the case of the Cessna Skymaster, one engine is mounted on the front and another in back. Aircraft designers have always been keen to try new and strange ideas.

Landing gear typically comes in two types: tricycle and tail-dragger. Tricycle-gear aircraft have the wheels configured just like a tricycle, with one wheel under the nose. Tail-draggers instead have a small wheel on the tail. Airplanes may also be equipped with floats instead of wheels for water landings, or skis for snow and ice landings. Additionally, some airplanes are designed with a boat-shaped hull for water landings.

(Figure 1: Tail Dragger Landing Gear)

Aeronautical engineering for passenger aircraft is one of the hardest and most complex jobs in the aircraft industry today. My plane would take the airplane of the future to a whole other level. It would have a jumbo-jet-style fuselage with room all around for the comfort of tall people, so there would be no need to crunch up in tight spots. The engines would be on a tilt-wing axis for short-takeoff ability, which would make it possible to take off right from the gate. The cockpit would have all-touch technology for good accessibility for the pilot. This would be a good resource, but to cover the con of possible failures there would be another set of non-fiber-optic controls for emergencies. This paper discussed the parts of passenger aeronautical engineering, and I hope I have informed you about the many topics of this research paper.

REFERENCES

www.boeing.com/.../05/corp_innovative_thinking_05_07_12.html

http://www.answers.com/topic/ruddervators

http://www.answers.com/topic/ruddervators#ixzz2Ti3h77K6

http://travelforaircraft.wordpress.com/2011/04/13/early-flying-boat-early-airliner-benoist-xivairboat/

http://careers.stateuniversity.com/pages/818/Flight-Engineer.html
UNDERGRADUATE PRESENTATIONS AT A GLANCE

NEURAL NETWORKS
Nicholas Reid

SECURING BIG DATA
Alex Boardley

TERRAIN CLASSIFICATION
Jazzmine Bess

THE EXPANSION OF CYBERSECURITY
Avery Sherrod

THE TRUTH ABOUT BIOARTIFICIAL ORGANS
Jessica Boardley

MODERN WEARABLE TECHNOLOGIES
Michael Bijou

CLOUD TECHNOLOGY
Bryan C. Bemley and Chauncey Miller
MEET THE UNDERGRADUATE PARTICIPANTS

NICHOLAS REID
Baruch College
Brooklyn, NY

A 2011 graduate of Medgar Evers College Preparatory High School, Nicholas Reid majored in math and science and received an Advanced Regents High School Diploma. Nicholas has been accepted into the fall 2013 semester at the Polytechnic Institute of New York University, where he will major in physics and minor in computer technology. His presentation is "Neural Networks".

ALEX BOARDLEY
Wilmington University
Wilmington, DE

Alex Boardley is a sophomore at Wilmington University majoring in Computer Network and Security. His presentation is "Securing Big Data".

JAZZMINE BESS
Florida A&M University
Tallahassee, FL

Jazzmine Bess is a fourth-year computer information systems student. She will graduate in December 2013, magna cum laude. Her presentation is "Terrain Classification".
AVERY SHERROD
Delaware Technical and Community College
New Castle, DE

Avery Sherrod is currently attending Delaware Technical and Community College, where he holds a 3.2 overall grade point average and is on target to graduate with an Associate of Science degree in Computer Information Systems in May 2013. His presentation is "The Expansion of Cyber Security".

JESSICA BOARDLEY
University of Delaware
Newark, DE

Jessica Jean Boardley is a freshman at the University of Delaware in Newark, Delaware. She plans on becoming an officer and mental health nurse in the Air Force. Her presentation is "The Truth About Bioartificial Organs".

MICHAEL BIJOU
Bowie State University
Bowie, MD

Michael Bijou is an aspiring computer scientist from Arlington, Virginia. He is currently attending Bowie State University to obtain a bachelor's degree in computer science. His presentation is "Modern Wearable Technologies".
BRYAN BEMLEY
Bowie State University
Bowie, MD

Bryan Bemley is currently attending Bowie State University in Bowie, Maryland, where he is doing research in 3D modeling and animation, graphic arts and design, web design and development, mobile application development, and high-performance computing with a slight focus on visualization. Some of his future goals are to start his own 3D animation, web development, and research company, and to develop user applications that can make the use of technology better for everyday life. His presentation is "Crash Course Cloud Computing".
NEURAL NETWORKS

NICHOLAS REID
Baruch College
Brooklyn, NY

As humans, we are naturally innovative; we find new ways to do old things, or create new things to ease or help us perform tasks in life, be it cooking, driving, or even diagnosing someone in a doctor's office. As we grow, technology grows, and it is becoming more prevalent in the medical field; there has been a sudden boom of artificial intelligence systems being applied in the medical arena. This paper will examine neural networks. In particular, my focus will be on how neural networks relate to the cardiac medical field and how they are used to help diagnose heart arrhythmias.

What are neural networks? As defined by dictionary.com, neural networks are, "Also called neural nets. A computer model designed to simulate the behavior of biological neural networks, as in pattern recognition, language processing, and problem solving, with the goal of self-directed information processing." Neural networks can be split into two subcategories: Biological Neural Networks (BNNs) and Artificial Neural Networks (ANNs). To understand how ANNs work, we first need to understand how BNNs work, that is, how our brains process information and learn new things.

The human brain is made up of cells known as neurons. Neurons are basically composed of three things: dendrites, an axon, and a cell body. The dendrites are little finger-like projections, located at the head of the neuron, that take electrical inputs from other neurons. The cell body, usually called the soma, is where the nucleus of the neuron is, and the axon is the usually long process of a nerve fiber that generally carries impulses away from the body of the nerve cell. At the end of the axon, it splits into tiny branches, and at the end of each branch is a structure called a synapse. The synapse converts the activity in the axon into electrical impulses that inhibit or excite the neighboring connected neurons, forming synaptic connections from the end of one neuron to the dendrite base of another. When a neuron receives excitatory input that is sufficiently large compared with its inhibitory input, it sends a spike of electrical activity down its axon. Basically, a biological neuron receives inputs from other sources, combines them in some way, performs a generally nonlinear operation on the result, and then outputs the final result. Learning occurs by changing the effectiveness of the synapses so that the influence of one neuron on another changes; in other words, the brain has developed ways for neurons to change their response to new stimulus patterns so that similar events may affect future responses (self-adaptation, learning).

ANNs are an attempt to emulate the brain. Their architecture is designed analogously to how information is processed in our brains. ANNs contain many processors or units, also known as neurons, and the units are interconnected with each other. A neuron in an ANN is a device that carries out all the basic functions of a biological neuron; however, it is much simpler than a biological neuron.
Each artificial neuron receives a set number of inputs xn, from either external sources or other neurons, and each input has a connection weight wn. The inputs are multiplied by the connection weights and summed to produce S. The result of this summation S is the input to a transfer function F(S), and the output of this function is the neuron's result. While most ANNs are made from this basic unit, some differ by varying these units.

The design of a neural network is a very complex and arduous process. The developer must go through a trial-and-error phase, working through a list of design decisions before arriving at a satisfactory design. The issues that occur in this phase are major concerns for ANN developers. Designing an ANN consists of: putting the neurons in different layers; choosing the types of connections for inter-layer (different layers) or intra-layer (same layer) connected neurons; deciding how a neuron will receive an input and produce an output; and determining the strength of the connection weights by letting the network adjust itself using a training set.

BNNs are constructed in three-dimensional ways from very small components, and these neurons seem to be able to interconnect with each other almost limitlessly; in ANNs that is not the case, nor is it possible... yet. ANNs are simple clusterings of artificial neurons, created as layers which are then connected to each other. Most ANNs share the same overall architecture. Some neurons interface with the world to take in its input, others interface with the world to output information, and the rest are hidden from us. Neurons are grouped into layers: the input layer is where neurons take input from the external environment (the world), the output layer consists of neurons that communicate the result of the network to the external environment, and there are one, two, or more hidden layers in between these two layers.
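A minimal Python sketch of the weighted-sum neuron just described may help (added for illustration; the step-function threshold is an assumed choice, since the paper does not fix a particular transfer function):

```python
# One artificial neuron: weighted sum of inputs passed through a transfer function.
def neuron(inputs, weights, transfer):
    s = sum(x * w for x, w in zip(inputs, weights))  # S = sum of xn * wn
    return transfer(s)                               # result = F(S)

# Step-function transfer (an assumed choice for this example).
step = lambda s: 1 if s >= 0 else 0

print(neuron([1.0, 0.5, -0.2], [0.4, -0.6, 0.3], step))  # -> 1
```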
When the input layer neurons receive input, they output a result which becomes the input to other layers of the network; this process repeats itself until a condition is met or the output layer is reached. To decide how many hidden layers a network should have to function at its peak, developers resort to trial and error. If you have too many hidden layers, your network will overfit: given a training set, instead of learning a pattern and deducing an answer, the network will simply memorize the training set, rendering the net useless.

Neurons are connected to each other by a number of paths. These paths are in most cases unidirectional, or more rarely a two-way connection between two neurons. A neuron may communicate with neurons in its own layer or with neurons in a different layer, each handled by different means. We can split ANN communication into two kinds: inter-layer and intra-layer.

Inter-layer communication is when neurons from different layers communicate with each other. There are different types of inter-layer communication. The first is called fully connected: each neuron on the first layer is connected to every neuron on the second layer. Next is partially connected: each neuron on the first layer does not have to be connected to a neuron on the second layer. Feed-forward: the neurons on the first layer send their outputs to the second layer but do not receive any input back. Bi-directional: there is another set of connections carrying outputs from the second layer back to the first. Hierarchical: the neurons of a lower layer can only communicate with the neurons of the layer exactly above it. Resonance: the layers have bidirectional connections and can continue sending messages across the connections a number of times until a certain condition is achieved.

In more complex ANN structures, neurons communicate among themselves within a layer; this is intra-layer communication. There are two types of intra-layer communication: recurrent and on-center/off-surround. Recurrent means the neurons within a layer are fully or partially connected to one another; after these neurons receive input from another layer, they iterate their outputs with each other before they are allowed to send their outputs to another layer. Generally, some condition among the neurons of the layer must be achieved before they communicate their outputs to another layer. On-center/off-surround is when a neuron within a layer has excitatory connections to itself and its immediate neighbors, and inhibitory connections to other neurons.
Learning in BNNs basically occurs through experience, and the same is true of ANNs. Training algorithms, by changing the connection weights, cause the network to learn the solution to a problem. The learning ability of a net depends on its architecture and on the algorithmic method used to train it. Training usually follows one of these three methods: unsupervised learning, reinforcement learning, or back propagation. Unsupervised learning is when the hidden-layer neurons must organize themselves without any help from the outside (the developer); in this method the net does not have any sample outputs to which it can compare its own, which is why this method is often described by the phrase "learning by doing." In reinforcement learning, the weights in the hidden layer are randomly shuffled, then reshuffled as the net is told how close it is to solving the problem; reinforcement learning is also called supervised learning because it requires a so-called teacher, which can be a training set of data or an observer who grades the net's performance. In back propagation, the net is not just given reinforcement for how it is solving the problem: information about its errors is sent back through the net and used to adjust the connection weights between layers so as to improve performance.

ANN learning can also be classified as on-line or off-line. In off-line learning methods, once the system enters operation mode, its weights are fixed and do not change any more; most networks are of the off-line learning type. In on-line or real-time learning, when the system is in operating mode (recall), it continues to learn while being used as a decision tool. This type of learning has a more complex design structure.

In ANNs, a variety of algorithms are used to update connection weights. These algorithms are known as learning laws. A few of the major learning laws are Hebb's Rule, the Hopfield Law, the Delta Rule, and Kohonen's Learning Law. Hebb's Rule is the first and best-known learning law, created by Donald Hebb. The rule is: if a neuron receives an input from another neuron and both are highly active (mathematically, have the same sign), the weight between the neurons should be strengthened. The Hopfield Law states, "if the desired output and the input are both active or both inactive, increment the connection weight by the learning rate, otherwise decrement the weight by the learning rate." The Delta Rule (TDR) is another variant of Hebb's Rule; TDR works by continually modifying the connection strengths in the net to reduce the difference between the desired output value and the actual output value. Kohonen's Law, a procedure developed by Teuvo Kohonen, was inspired by learning in biological systems. In this procedure, the neurons compete for the opportunity to learn, or to update their weights. The processing neuron with the largest output is declared the winner and has the capability of inhibiting its competitors as well as exciting its neighbors. Only the winner is permitted output, and only the winner plus its neighbors are allowed to update their connection weights.
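As an illustration of two of the update rules above, here is a short sketch (not from the paper; the learning rate and all values are invented for the example):

```python
# Weight updates for a single connection; the 0.1 learning rate is assumed.
LEARNING_RATE = 0.1

def hebb_update(weight, pre_activity, post_activity):
    # Hebb's rule: strengthen the weight when both neurons are active together.
    return weight + LEARNING_RATE * pre_activity * post_activity

def delta_update(weight, x, actual, desired):
    # Delta rule: nudge the weight to shrink the gap between desired and actual output.
    return weight + LEARNING_RATE * (desired - actual) * x

w = 0.2
w = hebb_update(w, pre_activity=1.0, post_activity=1.0)  # -> 0.3
w = delta_update(w, x=1.0, actual=0.8, desired=1.0)      # -> 0.32
```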
The history of neural networks dates back to 1943, when neurophysiologist Warren McCulloch and mathematician Walter Pitts wrote a paper on how neurons might work; to describe how neurons in the brain might operate, they modeled a simple neural network using electrical circuits. Later, in 1949, Donald Hebb wrote "The Organization of Behavior", a work which pointed out that neural pathways are strengthened each time they are used, a concept
fundamentally essential to the ways in which humans learn. If two nerves fire at the same time, he argued, the connection between them is enhanced. As computers became more advanced in the 1950s, it finally became possible to simulate a hypothetical neural network. Then, in 1959, Bernard Widrow and Marcian Hoff of Stanford developed models called ADALINE and MADALINE; the names come from their use of Multiple ADAptive LINear Elements. ADALINE was developed to recognize binary patterns, so that while reading streaming bits from a phone line it could predict the next bit. MADALINE was the first neural network applied to a real-world problem, using an adaptive filter that eliminates echoes on phone lines. In 1962, Widrow and Hoff developed a learning procedure that examines the activation value before it is adjusted. It is based on the idea that while one active neuron may have a big error, one can adjust the activation values to distribute it across the network, or at least to adjacent neurons.

Despite the later success of the neural network, traditional von Neumann architecture took over the computing scene, and neural research was left behind. In the same time period, a paper was written suggesting that there could not be an extension from the single-layered neural network to a multiple-layered neural network. In addition, many people in the field were using a learning function that was fundamentally flawed because it was not differentiable across the entire line. As a result, research and funding went drastically down.

In 1982, interest in the field was renewed. John Hopfield of Caltech presented a paper to the National Academy of Sciences; his approach was to create more useful machines by using bidirectional lines, where previously the connections between neurons had been only one way. That same year, Reilly and Cooper used a "hybrid network" with multiple layers, each layer using a different problem-solving strategy. Also in 1982, there was a joint United States-Japan conference on Cooperative/Competitive Neural Networks. Japan announced a new Fifth Generation effort on neural networks, and US papers generated worry that the US could be left behind in the field. By fifth-generation computing, artificial intelligence was being used.

In 1986, with multiple-layered neural networks in the news, the problem was how to extend the Widrow-Hoff rule to multiple layers. Three independent groups of researchers, one of which included David Rumelhart, a former member of Stanford's psychology department, came up with similar ideas, which are now called back propagation networks because they distribute pattern-recognition errors throughout the network. Hybrid networks used just two layers; these back propagation networks use many. The result is that back propagation networks are "slow learners," needing possibly thousands of iterations to learn.

Now that we have a general grasp of what an ANN is, I will explain how they are used in the cardiac medical field, more specifically in ECG analysis. First, what is an electrocardiogram, or ECG? As defined by mayoclinic.com, "An electrocardiogram is used to monitor your heart. Each beat of your heart is triggered by an electrical impulse normally generated from special cells in the upper right chamber of your heart. An electrocardiogram — also called an ECG or EKG — records these electrical signals as they travel through your heart." ECG examination has been used to diagnose cardiovascular disease.
In many cases, such as the Intensive Care Unit (ICU) or the
Holter system, recording and analyzing ECG patterns is necessary, and automating it can help reduce the physician's work and improve the efficiency of diagnosis. A typical ECG contains the P wave, QRS complex, and T wave in each heartbeat. Recognizing an ECG pattern is essentially the process of extracting and classifying ECG feature parameters, which may be obtained either from the time domain or from a transform domain. The features frequently used for ECG analysis in the time domain include the wave shape, amplitude, duration, areas, and R-R intervals.

The problem with automating ECG analysis is threefold. The first issue is the non-linearity in ECG signals: basically, the haphazardness and unpredictability in an ECG. The second is the large variation among anatomies. The third is that ECGs are contaminated by background noise (signal interference), such as electrode motion artifact and electromyogram-induced noise, which also adds to the difficulty of automatic ECG pattern recognition. Given all of these issues, ANNs are a natural solution to these problems. ANNs are characterized by robustness and non-linearity, which is why they are great at dealing with non-linear problems and cope easily with noise. ANNs are also characterized by their adaptive abilities and their recognition speed, which is fast.

A good example of how ANNs are used in ECG automation is a study done by Lin He, Wensheng Hou, Xiaolin Zhen, and Chenglin Peng at the Biomedical Engineering College of Chongqing University. In this study they chose four types of ECG patterns taken from the Massachusetts Institute of Technology - Beth Israel Hospital (MIT-BIH) Arrhythmia Database, and created three different neural networks, SOM (Self-Organizing Map), BP (Back Propagation), and LVQ (Learning Vector Quantization), to recognize the ECG patterns. The SOM ANN is an unsupervised self-learning neural network, developed by Teuvo Kohonen in 1981; a SOM net consists of one input layer and one competitive layer. The BP net was a supervised multilayer feed-forward net. The LVQ net is composed of one input layer, one hidden layer, and one output layer; the neurons in the hidden layer and output layer only produce binary outputs. After training the nets and recording the results when they were put into action, the overall accuracy of each net was over 90%, and the SOM net was the most accurate, with an overall accuracy of 95.5%.
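The study's actual code is not reproduced here; purely to illustrate the pipeline it describes (extract feature vectors per beat, train on labeled examples, classify new beats), the sketch below uses invented feature values (R-R interval and QRS duration, in seconds) and a simple nearest-centroid rule standing in for the SOM/BP/LVQ nets, which learn far richer decision boundaries:

```python
import numpy as np

# Invented training features: [R-R interval (s), QRS duration (s)] per beat.
train = {
    "normal":     np.array([[0.80, 0.08], [0.85, 0.09], [0.78, 0.08]]),
    "arrhythmia": np.array([[0.45, 0.14], [0.50, 0.15], [0.40, 0.13]]),
}

# "Training" here is just averaging each class into a centroid.
centroids = {label: beats.mean(axis=0) for label, beats in train.items()}

def classify(beat):
    """Assign the beat to the class with the nearest centroid."""
    return min(centroids, key=lambda label: np.linalg.norm(beat - centroids[label]))

print(classify(np.array([0.82, 0.08])))  # -> normal
print(classify(np.array([0.48, 0.15])))  # -> arrhythmia
```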
The point of the study was to prove the effectiveness of neural networks in recognizing ECG patterns and actually helping doctors with their diagnoses. There are many other studies conducted at different institutions, such as "Atrial fibrillation classification with artificial neural networks" by Sadik Kara and Mustafa Okandan at Erciyes University, and "Classifying Multichannel ECG Patterns with an Adaptive Neural Network" by S. Barro, M. Fernandez-Delgado, J.A. Vila-Sobrino, C.V. Regueiro, and E. Sanchez at the University of Santiago de Compostela; all these studies show that neural networks are beneficial to the cardiology field. In the future, I believe neural network applications will become an essential part of the cardiologist's "toolbox," as well as of ICUs at hospitals; their ability to interpret and recognize ECG patterns, help diagnose heart disease, and predict myocardial infarctions is going to be integral in helping prolong the lives of people with or without cardiovascular complications.

In conclusion, there has been a sudden rise in the use of neural networks in the medical field. In this paper we discussed what neural nets are, how they help improve cardiovascular health, keying in on ECG pattern recognition and analysis, and their foreseeable future in the cardio-medical field.

REFERENCES

Kara, Sadik, and Mustafa Okandan. "Atrial Fibrillation Classification with Artificial Neural Networks." Pattern Recognition. 5 Mar. 2007. Web.

Barro, S., M. Fernandez-Delgado, J.A. Vila-Sobrino, C.V. Regueiro, and E. Sanchez. "Classifying Multichannel ECG Patterns with an Adaptive Neural Network." IEEE Engineering in Medicine and Biology (1998). Jan.-Feb. 1998. Web.

He, Lin, Wensheng Hou, Xiaolin Zhen, and Chenglin Peng. "Recognition of ECG Patterns Using Artificial Neural Network." The IEEE Computer Society. Web.

Klerfors, Daniel. Artificial Neural Networks. Report. Print.

"Electrocardiogram (ECG or EKG) - MayoClinic.com." Mayo Clinic. Web. 18 May 2013. <http://www.mayoclinic.com/health/electrocardiogram/MY00086>.

"Neural Networks - History." Stanford University. Web. 18 May 2013. <http://www-cs-faculty.stanford.edu/~eroberts/courses/soco/projects/neuralnetworks/History/history1.html>.
SECURING BIG DATA

ALEX BOARDLEY
Wilmington University
Wilmington, DE

WHAT IS BIG DATA?

The definition is "a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications." According to SAS, big data is a popular term used to describe the exponential growth, availability, and use of information, both structured and unstructured (SAS). What does that mean? The volume of data being collected is a challenge to software applications: software cannot obtain and process the data in a way we can use efficiently and in a timely manner. So which companies use this much data? Ever heard of Software AG, Oracle Corporation, IBM, Microsoft, SAP, EMC, and HP? Together these companies have spent more than 15 billion dollars on software firms specializing only in data management and analytics. This is all for a good reason: holding, using, and keeping this data safe can be the difference between a thriving business and an unsuccessful one. Analytics is important to invest in because it can give you the competitive edge against your competitors. A person in analytics uncovers hidden patterns in the big data and tries to pick up useful trends to help the company, all in all increasing revenue.

VOLUME, VELOCITY AND VARIETY

You may know there are three "V"s to big data: volume, velocity, and variety. Volume means that it is almost impossible to grasp the sheer amount of data; velocity means data comes in fast and furious; and variety means data comes in all forms, structured, unstructured, and semi-structured.
This idea was created by Gartner, Inc. back in 2001. For a better understanding, Gartner tweeted this: "Big data is high-volume, -velocity and -variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making." The most interesting of the three Vs is variety. The ever-growing volume of data, attributable to years of data collection and the increasing stream of text messages from growing participation in social media, is contributing to the storage issues of today. How does one determine what data is relevant when looking at such a vast amount of data? Factors that contribute to the increase of data volume include the constant streaming of data from social media. There are many family arguments based on the overuse of a family data plan; this is just one example of the daily overuse of data. What makes any of this data relevant? In many cases all of the data needs to be stored and protected without setting a priority on one set of data over another. The data is received in a variety of formats, including video, photos, emails, documents, and a host of other unique files. Velocity is represented by the massive amounts of data generated on a daily basis; RFID tags are an example of solutions that contribute to the increasing velocity of data.

WHY HACKERS PLANT VIRUSES

The bulk of all the data out there is already owned by organizations, but it goes unused. This data is called dark data: for example, elevator logs can help predict vacated real estate. The organizations own this data, but it will remain unused. What are the reasons hackers are infecting our computers? Small businesses are ripe targets for hackers. The distribution of attacks by company size, as reported in Verizon's 2012 Data Breach Report, is staggering: bad hackers are largely at war with small businesses, with 70% of all attacks perpetrated against tiny businesses with 100 employees or fewer. To gain access to an organization's most valuable property, hackers are only getting better. A hacker is someone who seeks and exploits weaknesses in a computer system or computer network. Hackers can be motivated by many different reasons, including profit, protest, or challenge. The lure of financial gain has motivated cybercriminals to implement innovative new methods and to become more thorough with each passing year. Today, cybercriminals are sophisticated, evolving their craft and tools in real time. For example, malware created today often undergoes quality-control procedures: cybercriminals test it on numerous machines and operating systems to ensure it bypasses detection.
Spammers, criminal syndicates, and other hostile entities are switching their attention from conventional high-volume spam to carefully crafted, low-volume attacks that are more sophisticated, devious, and potentially costly. Instead of peddling counterfeit drugs or luxury goods, they are launching phishing attacks, distributing email with fake links or malware attachments (such as keyloggers and rootkits) that enable criminals at remote locations to surreptitiously siphon or "exfiltrate" valuable business data from an enterprise network. Hacking can even be international; to a hacker, we are all next door. The New York Times has announced serious possibilities that China's military has stolen information from companies "involved in the critical infrastructure of the United States - its electrical power grid, gas lines and waterworks."

VIRUSES AND VULNERABILITIES

In the 1990s, the average personal computer user received one or two spam messages a day. As of August 2010, the amount of spam was estimated to be around 200 billion spam messages sent per day. Today there are nine major types of computer viruses that infect millions of computers every day: macro viruses, boot sector viruses, browser hijackers, direct action viruses, file infector viruses, multipartite viruses, polymorphic viruses, resident viruses, and web scripting viruses. They can do anything from spamming your computer to deleting all of your files and stealing your personal information, like Social Security numbers and credit card numbers, for their own use. They are called viruses because they have the ability to replicate. Many viruses, which disguise themselves as tracking cookies, are meant to allow access to personal information that you give out over the internet; this is why it is good to always make time to delete all the cookies in your history. When you purchase items online, especially from non-recommended sites, there is a risk of identity theft. Viruses can also slow down your computer significantly, erase information, destroy vital data, and even shut down your computer.

The risk is equally present on a mobile device like a cell phone or tablet. Thanks to online and mobile transactions, social media traffic, and GPS coordinates, we now generate over 2.5 quintillion bytes a day, and this is predicted to increase by an outstanding 100% by 2015. Discovering that your intimate conversations, pictures, or texts have been splattered across the Internet for all to see is not only an invasion of privacy but can also be damaging to your personal life and livelihood. It is important that you don't give out any important information over text messaging, even to somebody you trust; you never know what your relationship with that person will become, so you must be prepared to change your passwords immediately. Also make sure you set a password on your voice mail: most phones come with a default code for listening to voice mail,
and that code can be used from other phones. Lastly, watch out for the applications you download onto your device. Malware, which can steal personal information, can be hidden within applications; this is more common on Android phones and jailbroken Apple products.

How can we prevent our computers from being hacked? Make sure you keep up to date on your software updates. Install a firewall on your computer. Change your password every month. Purchase or download anti-virus software, and install anti-spyware programs. Lastly, make sure you delete emails from unknown sources. Now imagine a hacker having access to terabytes upon terabytes of other people's personal information, and that is where we have a problem: if companies possess big data, they need to possess efficient ways to keep it safe. To keep big data safe, large companies follow a multitude of steps. When the data is collected by the organization, it is sent to the organization's cloud, an off-site area where the big data is stored. For safety, every organization should have another site backing up all of the information it holds, just in case of complication or system compromise. It may cost more money, but it saves more in the long run.

DATA STORAGE

The data is stored in storage tiers, with tier 1 the most important and efficient storage level, because that data is used the most. It hosts mission-critical servers and computer systems, with fully redundant subsystems (cooling, power, network links, storage, etc.) and compartmentalized security zones controlled by biometric access control methods. Tier 2 is a little less important, and tier 3 the least, with minimal checkups. How can companies that hold big data keep it safe? With help from companies like Proofpoint. Fortunately, there is a new approach to threat detection and remediation: instead of pattern-matching against "known bad" email, Proofpoint's security-as-a-service platform applies Big Data analysis techniques to continuously analyze billions of email messages and ever-changing patterns of communication, enabling the platform to detect anomalous behavior.
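Proofpoint's platform is proprietary, so purely to illustrate the idea of flagging anomalous behavior in message traffic, here is a toy baseline-deviation check (the threshold and data are invented, not drawn from any vendor's method):

```python
import statistics

# Invented daily message counts from one sender; the last day spikes.
daily_counts = [102, 98, 110, 95, 105, 99, 480]

baseline = daily_counts[:-1]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Flag a day as anomalous if it sits more than 3 standard deviations from the mean.
today = daily_counts[-1]
if abs(today - mean) > 3 * stdev:
    print(f"anomaly: {today} messages vs baseline {mean:.0f} +/- {stdev:.0f}")
```

Real systems apply the same idea at vastly larger scale and over many more signals than a single per-sender count.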
TYPES OF BIG DATA SECURITY

There are two different forms of big data security analytics: real-time solutions and asymmetric solutions. Real-time solutions are built around a distributed architecture, made up of appliances designed for local streaming processing and collective parallel processing. Real-time big data security analytics solutions tend to collect and analyze old standby data like logs, network flows, and IP packets across the enterprise, with a view of the data from L2 through L7. Asymmetric big data security analytics solutions can be built on proprietary data repositories, but it is likely that all products will support big data technologies like Cassandra, Hadoop, and NoSQL over time. Security analysts will feed these solutions with batch updates containing terabytes of structured and unstructured data in order to look at historical security trends over long periods of time. Asymmetric big data security solutions will be anchored by machine learning algorithms, cluster analysis, and advanced visualization.

BIG DATA MANAGEMENT

There is some good news regarding the management of big data: spam volumes are declining. They dropped 68% year-over-year between February 2011 and February 2012, and in a 2011 year-end report IBM estimated that volumes had returned to their 2008 levels. Other estimates suggest far more dramatic declines: from an all-time daily high of 225 billion messages down to 25 billion, a drop of nearly ten times. This is the 21st century; everything fun or beneficial runs on ones and zeros. In fact, 90% of the data ever created has been created in the past two years (Conner Ch. 3). Innovations like "the cloud" have given attackers a bigger playing field, putting everybody else at much higher risk. America is being attacked more and more by other nations, and the biggest problem is that "the attacks are changing from intrusion attacks to disruptive attacks" (Kerner Ch. 13). Large industries need to be secure and prepared for any cyber attack. Last month, a malware-infected site appeared to be the root cause of some high-profile hacking: Facebook, Microsoft, Twitter, and Apple were all compromised in the attack. Now, The Security Ledger has learned that the one infected site was not the only cause of these attacks; at least two other mobile application development sites were also attacked, and an undisclosed number of other, non-dev-focused sites were part of the attack as well. The victims of the attacks also reportedly went well beyond the technology segment: auto manufacturers, US government agencies, and a high-profile candy maker were also hit.

Protecting Big Data archives presents an additional challenge based on the massive amount of data; a recurring backup is just not feasible. Massive amounts of data, including videos and photos, can go years without being used at all, and the occasional query would touch only small components of the massive data source. The challenge is in the data size and the complex access pattern. As a result, a disk-only storage solution may not be practical. Object-based storage using cloud infrastructure could be the solution, as these vendors have eliminated the key scaling issue present in traditional file systems. The solution includes establishing a near-infinite disk-based storage system that gets backed up to another disk-based, near-infinite capacity system. The new issue becomes the cost, which could be addressed with a tape backup.
SECURING BIG DATA

Securing data is one of the most important criteria in managing a big data environment, but it can also be difficult as a result of the magnitude and complexity of the data. The first step toward securing big data is to understand the storage patterns and how the data will be used. The concept of Big Data is not going away any time soon; if anything, it will get bigger. Big Data is important to both the business and the technology side of any corporation, resulting in a perfect marriage. As recommended on Wired's Innovation Insights, there are five steps you can take to secure your big data. First, assess your data storage environment and understand usage patterns; this will help you know where information is most vulnerable. Make sure that the data cluster is secured against attacks. The next step is to keep controls close to the data source, which creates a more effective line of defense. The third step is to embed security in the data center cluster in the form of access control such as RBAC; this way, if the perimeter is breached, the cluster and sensitive data are still protected by a security wrapper. Encrypt both static and in-transit data, keeping it protected in a continuous state of motion as it travels from node to node. Lastly, incorporate a logging function into the cluster, as mandated by most regulatory compliance regimes for auditing purposes.
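As one concrete illustration of the encryption step, the snippet below uses the third-party Python cryptography package to encrypt a record before it is stored or moved between nodes. The package choice and the sample record are my assumptions for the example, not part of the paper.

```python
# pip install cryptography  (third-party package; an assumed choice for this sketch)
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice the key would live in a key-management service
cipher = Fernet(key)

record = b"customer_id=1042,balance=2500"  # placeholder record, invented for the example
token = cipher.encrypt(record)             # ciphertext is safe to store or transmit
assert cipher.decrypt(token) == record     # only holders of the key can read it back
```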
REFERENCES

Crump, George. "How to Protect the Big Data Archive." InformationWeek. Retrieved Apr. 2013. www.informationweek.com/storage/disaster-recovery/how-to-protect-the-big-data-archive/232901095

"Big Data - What Is It?" SAS. Retrieved Apr. 2013. http://www.sas.com/big-data/

Krikken, Ramon. "'Securing Big Data' - The Newest Fad." Gartner. http://blogs.gartner.com/ramon-krikken/2012/05/10/securing-big-data-the-newest-fad/

Kossman, Rachel. "Understanding Primary Data Storage in the Cloud." SearchCloudStorage. http://searchcloudstorage.techtarget.com/podcast/Understanding-primary-data-storage-in-the-cloud

Vogt, Jim. "Securing Big Data's Future." Wired.

Hurwitz, J., A. Nugent, F. Halper, and M. Kaufman. 2013. Big Data for Dummies. Hoboken, NJ: John Wiley & Sons, Inc.

TERRAIN CLASSIFICATION

JAZZMINE BESS
Florida A&M University
Tallahassee, FL

INTRODUCTION

"The science of today is the technology of tomorrow." - Edward Teller. Science and technology are interdependent; both are needed for the betterment of humanity, and exploration and innovation would not exist without the codependency of technology and science. This year I had the opportunity to experience firsthand the bond between the two. NASA sponsors NASA USLI, a rocket competition held every year for universities and high schools nationwide; the event is a school-year-long competition. This is my second year on the team, and I serve as Project Lead. The ultimate goal of the program is to build a rocket that will fly a mile high and contain a scientific payload. A scientific payload is any experiment that can be conducted in the rocket and recovered afterward. I am a third-year computer information systems student, and I lead a team of computer science and engineering students. My team and I developed a payload that incorporates technology and science. The research started in August, and the competition will take place April 17-22. The experiment I am conducting is a terrain classification experiment. The experiment streams video from the rocket, takes still frames from the video at set altitudes, and then sends the pictures to the computer to be analyzed. The computer will be able to determine the terrain of the still shots. Once the computer has analyzed the pictures, it will interact with a mobile device on the ground that will receive the pictures from the rocket along with the classification for each picture.

SCIENCE VALUE AND UNIQUENESS

This experiment was spurred by the recent success of the mission Curiosity. Curiosity is currently testing and identifying the surfaces on Mars. Terrain exploration is important to understanding other planets. While robots, such as the rover Curiosity, are normally used in terrain classification experiments, my experiment uses pictures. My project allows for experimentation without the additional cost of using robotics. Also, the project contains an app which interacts with a computer a mile high; integrating mobile technology with rocketry is a fairly new concept. The app will be designed to read data in real time. This experiment raises the question of how else mobile technology can influence rocketry.

HARDWARE COMPONENTS

The camera that I chose was the HackHD camera, selected for the quality of its pictures. The camera records 1080p HD video at a frame rate of 30 frames per second and allows for composite video output. The camera also comes with a 2-gigabyte microSD card. The