High Altitude Balloon with Stabilized Camera System

Lee Wasilenko
APSC 479, Engineering Physics
University of British Columbia
January 10, 2011

Executive Summary

This report describes the design and testing of a camera stabilization system intended for use on a high altitude balloon. The objective is to be able to control and stabilize a camera with enough precision to obtain clear photos of the stars and other celestial bodies. This report describes several possible mechanical designs, but no clearly superior mechanical design emerged during the time allotted to the project. Tests were performed to determine an acceptable level of relative displacement between the camera and its subject. This was determined to be less than 0.5deg overall, which can also be characterized as a rotation rate of less than 0.5deg/E(t), where E(t) is the shutter speed of the camera. The team constructed and tested a simple stabilization system with a gyroscopic rate sensor, an Arduino microcontroller, and a servo. The system behaviour was determined by PI code written by the team. The team found that this system was not sufficient to provide the stringent level of control required for a stable image: the displacement measured by the team was 5.7deg, which is more than 10 times higher than that required. Further optimization and design research must be performed before the stability requirements can be met.

Table of Contents

Executive Summary
List of Figures
1. Background and Motivation
2 Project Objectives
  2.1 Specific Objectives for APSC 479
  2.2 Objectives for future HAB Launch
  2.3 Scope and Limitations
  2.4 Organization
3.0 HAB Launch Research
  3.1 Balloon Selection
  3.2 HAB Flight Trajectory Prediction
  3.3 Communications and Recovery
  3.4 Camera Selection
  3.5 Star and Region of Space Recognition
4.0 Camera Stabilization
  4.1 Stability Theory and Requirements
  4.2 Mechanical Design
  4.3 Control Algorithm
  4.4 Description of Experimental Testing
  4.5 Results
  4.6 Discussion of Results
5.0 Conclusions
6.0 Project Deliverables and Financial Summary
  6.1 Deliverables
  6.2 Financial Summary
  6.3 Ongoing Commitments
7.0 Recommendations
References
List of Figures

Figure 1: Payload Axis Definition. Positive Z-axis points in the direction of latex balloon
Figure 2: Curvature of the Earth from 33km. Photo credit: David Stillman (4)
Figure 3: CU Spaceflight Trajectory Prediction
Figure 4: University of Wyoming Balloon Trajectory Prediction
Figure 5: 1.5deg of camera displacement, large view
Figure 6: 1.5deg of camera displacement, close up view
Figure 7: 1.0deg of camera displacement, large view
Figure 8: 1.0deg of camera displacement, close up view
Figure 9: 0.5deg of camera displacement, large view
Figure 10: 0.5deg of camera displacement, close up view
Figure 11: Shaky camera, large view
Figure 12: Shaky camera, close up view
Figure 13: Shaky camera with VR off, large view
Figure 14: Shaky camera with VR off, close up view
Figure 15: First draft stabilization design
Figure 16: Second iteration design
Figure 17: Full and close up view of iteration three mechanical design. Mockup courtesy of Bernhard Zender, UBC Engineering Physics Project Lab
Figure 18: Alternate design with gyroscopic stabilizer. Mockup courtesy of Bernhard Zender, UBC Engineering Physics Project Lab
Figure 19: Stabilizer design using gyroscopic stabilizer and inherent stiffness. Mockup courtesy of Bernhard Zender, UBC Engineering Physics Project Lab
Figure 20: Control program flow diagram
Figure 21: Components of experimental performance tests
1. Background and Motivation

The low cost and ready availability of a plethora of electronic components and semiconductor devices has made it possible for the amateur, hobbyist and student to construct devices and perform experiments which in the past would have been prohibitively expensive and complex. Two endeavours which are benefitting from the semiconductor revolution are the fields of amateur photography and high altitude balloon (HAB) projects.

Photography has benefitted from the development of CMOS and CCD technologies, which make it possible to capture high quality images, view the result instantaneously, edit as desired, and store the image in a variety of digital formats. At the cutting edge, the current state of the art in digital photography rivals that of film photography with much less expense and greater ease in editing. At the low and mid-range, digital cameras are inexpensive, lightweight, easy to operate, and have very high image quality relative to cost. This revolution in camera technology makes it possible for amateur photographers to capture images rivalling those of professional photographers, and for aspiring photographers all over the world to use cameras in a variety of ways to capture images and moments that would otherwise be impossible or prohibitively expensive.

A HAB is in general a large latex balloon filled with helium that carries a payload to altitudes frequently as high as 33km. HAB flights last between 3 and 5 hours and are terminated when the balloon bursts.
The burst is caused by the expansion of the latex: as the atmospheric pressure drops, the gas inside the balloon expands and stretches the latex, which eventually bursts as the tension in the stretched material becomes too high. Often the objective of a HAB launch is to perform some measurement or action while aloft and transmit location information back to Earth so that the launch team can recover the payload.

HAB projects have benefitted greatly from the large variety and low cost of electronic devices. In particular, the variety of sensors and communication devices available makes it possible for the amateur to design a payload that can take a variety of measurements while aloft and transmit detailed location information from a GPS chip back to the launch team by radio frequency signals so that the payload may be successfully recovered. The launches are relatively low cost; depending on the components used and measurements taken, a launch can cost less than $200 (1). Recently there have been a number of student and amateur teams combining these advances in order to obtain high quality photos and video records of the HAB flight, with spectacular results (2) (3) (4); see Figure 2 for an example. These teams have been able to capture awe inspiring photos of the curvature of the Earth with a minimum of cost by taking full advantage of the advances made in semiconductor devices and the ready availability of these devices thanks to vendors like Sparkfun (5). So far all of these teams have used a stationary camera rigidly housed in the payload, taking photos at timed intervals, and so have left the subject of the photos to chance.

The project team would like to introduce greater control and flexibility to the camera system with the objective of capturing clear images of space objects such as stars, the Moon, and perhaps the Sun. Capturing images of these objects will require special techniques. In particular, in the case of photographing the stars it will be necessary to take a long exposure photograph where the shutter is held open for an extended period of time. In order to capture a clear and unblurred image it will be necessary to construct a stabilization system to counteract the random motion of the payload and hold the camera still relative to its subject.

The project team attempted to construct a camera stabilization platform which would counteract the random motion of the payload in 2 of the 3 payload rotational motions, as well as provide a control platform capable of allowing the camera to point upward in order to take photographs of space objects located above the platform. The stabilization system was intended to compensate for rotation about the X and Y axes as defined in Figure 1. Rotation about the Z axis is an important consideration for taking a clear image but is beyond the scope of this project. The project team recommends Z axis rotation be addressed by a future team responsible for the other HAB launch system components. The project team intends to keep this consideration in mind in the design of its stabilization system so that the system remains flexible and open to development for several options in the compensation of rotation about the Z-axis.

Figure 1: Payload Axis Definition. Positive Z-axis points in the direction of latex balloon

The project team attempted to use an electro-mechanical system to implement the stabilization platform.
This system used an Arduino microcontroller fed by 2 digital gyroscopes to control 2 servo motors in order to cancel the rotational motion of the payload relative to the camera subject in the X and Y axes. The project team also considered using a gimballed stabilization system but decided that such a system is unsuited to this application due to its potential weight and the difficulty of construction given available resources. The team's goal was to create a stabilization system that would be lightweight and inexpensive. The servo motor system proposed here has the advantage of low cost and ease of mechanical construction over the gimballed system. This system is flexible and allows greater controllability in terms of camera direction, and so is better suited to meeting our objectives.

The project team intended to design a camera stabilization system that was not limited to application in HAB projects. Other potential applications could include stabilization for high speed or extreme sport photography, unmanned or remote controlled aerial vehicle photography, or remote controlled robotic systems. Our objective is to design a platform that is flexible enough to be used in a variety of applications while keeping costs low and performance high.

The structure and temperature of the atmosphere change quite significantly between the Earth's surface in the troposphere and our HAB's destination altitude in the stratosphere at 33km. Temperature and humidity are the main concerns for the successful operation of our camera stabilization system during the HAB flight. According to research on atmospheric structure, the temperature range our system will be dealing with is approximately +30°C to -60°C (6) (7). However, based on experiments and calculations performed by other HAB teams, it is likely that our system will operate in a less extreme temperature regime where the payload internal temperature may in fact not drop below -15°C if the payload is well insulated (8). The temperature range is extremely important for a successful HAB launch and recovery because the batteries that power the system, as well as the electrical components themselves, are not designed to operate properly in the extremely cold temperature range below -20°C.

This project is a self-sponsored project. This means that the project team itself is responsible for the resources and information required for the completion of the project, in conjunction with the UBC Engineering Physics Project Lab. The resulting design and prototype of this project will remain in the possession of the project team with the intention that it will be used in a future HAB launch by a team of UBC students. The rights to any photographs and data resulting from a HAB launch using the stabilization and camera system described in this proposal remain the property of the project team and the HAB launch team.

Our objective with this project was to obtain high quality photographs of space objects while developing a flexible stabilization platform. It is very exciting to construct something that will touch the edge of space, so we hope to eventually complete a HAB launch and recovery with a full complement of sensor systems so that we can model and recreate the flight trajectory and monitor the environmental conditions experienced during flight.
The project team intends this to be a fun and educational experience that will edify and satisfy the curiosity of the students involved and benefit the Engineering Physics Project Lab with educational and promotional materials.

Figure 2: Curvature of the Earth from 33km. Photo credit: David Stillman (4)

2 Project Objectives

The ultimate objective of this undertaking is a successful HAB launch with a variety of sensor systems taking data. The project team proposes this objective will be met in several phases. The first phase of this plan is designing, constructing and testing a camera stabilization system that will clearly meet the requirements of the APSC 479 course. In parallel with this phase the project group intends to be gathering data and considering other aspects of a full HAB launch. The second phase involves designing and constructing the rest of the flight, sensor, communication and data recording systems required for a successful launch and recovery. The third phase will involve the actual launch and recovery of the HAB. The fourth and final phase will involve compiling, analysing and presenting the data obtained by the camera and sensor systems and publishing or displaying that data where appropriate. The team hopes to be able to reconstruct the flight trajectory, the environmental conditions, and the forces experienced during the entire HAB flight.

2.1 Specific Objectives for APSC 479

At the end of the APSC 479 portion of this project we had intended to have achieved the following:

1. Design a 2-axis camera stabilization system that is mechanically simple, lightweight, and consumes little energy. It must also be low cost and suited to a variety of applications. The system must operate in temperatures below -15°C and cost less than $150 Canadian for 2 axes of control.

2. Construct the camera stabilization system out of parts readily available from internet vendors like Sparkfun and eBay, and manufacture parts only when necessary. This includes writing all of the code required to control the servo motors via data provided by the digital gyroscopes and interpreted by the Arduino microcontroller.

3. Test the camera stabilization system's response to rotation in the 2 selected axes. Attempt to determine whether this system will be sufficiently stable to take clear images of celestial objects.

Additional objectives for the APSC 479 portion could include:

1. Performing the necessary camera "hacks" in order to have the camera behave in the fashion we require: taking photos at timed intervals as well as taking long exposure photos at the appropriate time.

2. Write code to have the stabilization system aim up at celestial bodies at the proper time in order to take long exposure photographs. This may require input from other sensor systems (i.e. a pressure sensor).

3. Test the operation of the stabilization system in a cold environment and determine if the design is suitable for use in a HAB launch.

2.2 Objectives for future HAB Launch

There is a wide range of objectives that could be completed by a HAB launch. Some examples of objectives a future HAB team may like to meet include:

1. Design and construct a HAB flight computer and sensor network. This computer would include a GPS system, payload internal temperature sensor, external temperature sensor, barometric pressure sensor, 3-axis accelerometer, magnetic heading sensor, humidity sensor, radio frequency communication system, and microprocessor control and data storage system.
2. Payload enclosure design and construction, including the balloon and parachute system. The enclosure must be lightweight, insulated, and rugged enough to survive the impact of landing. The enclosure will also need to accommodate the sensor and camera systems and have a well-defined centre of gravity for correct operation and placement of the 3-axis accelerometer. A variety of appropriate balloons can be purchased from Kaymont (9).

3. Choose a launch location and predict the flight trajectory. This includes determining the relevant governmental guidelines for a HAB launch.

4. Compile, analyse and present flight data. It should be possible to reconstruct the entire flight trajectory from the information provided by the sensor network. A 3D computer simulation of the flight could then be constructed and presented along with graphs, photos, and video of the flight for educational and promotional use.

5. Several camera or sensor systems could be employed to take a variety of photographs in different wavelength regimes. For example the team could obtain infrared camera photos of the Sun or the Earth's surface.

6. The payload could include radio repeaters to be used in long range radio communication with student teams at other universities like the University of Calgary or Waterloo.

2.3 Scope and Limitations

The scope of this report is limited to the construction, methodology, and testing of the camera stabilization system. This report also includes research performed on other aspects relating to a HAB launch. It does not include a discussion of the relevant government regulations concerning the launch of weather and high altitude balloons.

2.4 Organization

This report will discuss general considerations and research relating to a HAB launch and then move on to a specific component of the flight system: the camera stabilization system. The project team provides mechanical design theory and potential designs, then tests an experimental setup for its ability to meet the stability requirements. Finally, recommendations are made for the improvement of the stabilization system and next steps in the HAB project are suggested.

3.0 HAB Launch Research

3.1 Balloon Selection

The project team recommends using a latex "sounding balloon" for the HAB launch. These balloons can be purchased from meteorological balloon suppliers like Kaymont (9). Kaymont has balloons in a variety of sizes, and for a HAB launch the size of the balloon will determine the allowable mass of the payload. As an example we can choose the KCI 1200, which has a recommended free lift of 1190g, a lift-off diameter of 179cm, and requires 2.99 cubic metres of helium to fill. The balloon will burst at an impressive 33.2km altitude at 7.3hPa of pressure, when the balloon reaches a diameter of 863cm. The cost is approximately $60 USD plus shipping per balloon.

3.2 HAB Flight Trajectory Prediction

In the design of the payload and component systems it is important to understand the conditions the HAB will be subject to during the flight. Launch and landing locations must be considered so that the payload and its load of data can be safely recovered. There are several balloon flight trajectory predictors available on the internet. One from Cambridge University Spaceflight allows you to select a location on a map as well as choose flight parameters like ascent rate, descent rate and burst altitude; it will then calculate your balloon's motion based on wind and weather predictions (10). Figure 3 shows the trajectory prediction from CU Spaceflight with a launch at 9:00am on September 27, 2010 in Bradner outside of Vancouver, an ascent rate of 5m/s, a descent rate of 5m/s, and burst at 30,000m.

Figure 3: CU Spaceflight Trajectory Prediction

For comparison, another trajectory computer can be found from the University of Wyoming's College of Engineering (11). This flight calculator takes latitude and longitude values and a burst ceiling, and produces a Google Earth .kml document. Unfortunately it does not allow much flexibility in the prediction window; Figure 4 therefore shows the resulting .kml file loaded in Google Earth with an 18-hour prediction window and the same launch coordinates as the CU Spaceflight predictor.

Figure 4: University of Wyoming Balloon Trajectory Prediction

The two predictions show a large disparity in the landing location. The CU Spaceflight prediction has the payload touching down somewhere in Manning Park while the UWYO prediction shows the payload touching down outside of Penticton. However, since we don't know the ascent and descent rates used by the UWYO calculator it is impossible to make an accurate comparison. In either case the distance travelled is on the order of hundreds of kilometres, and this is an important consideration when selecting a radio. A radio with 1W of power can transmit approximately 65km (12), so the launch team must attempt to remain within this range. If the balloon is at its maximum elevation of 33km then the team must be no more than 56km from the point on the Earth directly below the balloon. If the balloon is at 18km altitude, the point where the team will no longer receive valid GPS coordinates, then the team must be no more than 62.5km from the point on the Earth directly below the balloon in order to receive position data.
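The two ground distances quoted above follow from simple slant-range geometry: if the radio link is usable out to roughly 65km and the balloon is at altitude h, the team must stay within sqrt(65² − h²) of the point directly below the balloon. A minimal check of those numbers (C++; the 65km figure is the 1W range cited above, and terrain, antenna pattern and Earth curvature are ignored):

```cpp
#include <cmath>
#include <cstdio>

// Maximum ground distance from the point directly below the balloon for a
// given usable radio range and balloon altitude (both in km). Straight-line
// slant range only.
double maxGroundDistanceKm(double radioRangeKm, double altitudeKm) {
  return std::sqrt(radioRangeKm * radioRangeKm - altitudeKm * altitudeKm);
}

int main() {
  std::printf("at 33 km (burst):      %.1f km\n", maxGroundDistanceKm(65.0, 33.0)); // ~56.0
  std::printf("at 18 km (GPS cutoff): %.1f km\n", maxGroundDistanceKm(65.0, 18.0)); // ~62.5
  return 0;
}
```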
Another important factor in the descent rate is the size of the parachute and the weight of the payload. There are several online calculators used by model rocket hobbyists to calculate an appropriate parachute size. The simplest and most straightforward of these calculators is from Essence's Model Rocketry Reviews and Resources (12). This calculator allows you to simply enter the weight of your payload, assumes that you are looking for a safe descent rate of 3.5m/s to 4.5m/s, and then tells you what range of parachute sizes is appropriate. For example, with our 1190g KCI 1200 payload we would want a parachute 126.4cm to 162.5cm in diameter.
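Those diameters can be reproduced with the standard drag-equation sizing that calculators of this kind are typically built on. In the sketch below, the drag coefficient (0.75) and sea-level air density (1.225 kg/m³) are assumptions chosen because they reproduce the 126.4cm to 162.5cm range quoted above; they are not documented values from the calculator itself.

```cpp
#include <cmath>
#include <cstdio>

// Parachute diameter for a target descent rate, from the drag equation:
//   m*g = 0.5 * rho * Cd * (pi*d^2/4) * v^2   =>   d = sqrt(8*m*g / (pi*rho*Cd*v^2))
// rho and Cd are assumed values, not taken from the online calculator.
double chuteDiameterM(double massKg, double descentRateMps) {
  const double g = 9.81, rho = 1.225, cd = 0.75, pi = 3.14159265358979;
  return std::sqrt(8.0 * massKg * g / (pi * rho * cd * descentRateMps * descentRateMps));
}

int main() {
  std::printf("1.19 kg at 4.5 m/s: %.1f cm\n", chuteDiameterM(1.19, 4.5) * 100.0); // ~126.4
  std::printf("1.19 kg at 3.5 m/s: %.1f cm\n", chuteDiameterM(1.19, 3.5) * 100.0); // ~162.5
  return 0;
}
```

Since the diameter scales as sqrt(m)/v, a heavier payload or a slower target descent rate quickly drives up the parachute size.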
In either case it is important to realize that the Lower Mainland is not an ideal location for launching a HAB due to its geography, sandwiched between the ocean, the mountains, and an international border. The project team is fully trained, equipped and prepared to search for the payload in the mountain region, but we think it is better not to take the risk of losing the payload altogether. Several of the HAB launches documented on the internet have lost or nearly lost their payloads (2) (13) (14). Also, it is important to consider road access in choosing a launch and landing site. A good site should have many roads that are regularly spaced and run in predictable directions. Unless the payload is designed to float and resist water, the location should also have few lakes and rivers. It would therefore be better to launch the HAB in a prairie province such as Alberta. Radio communication range will also benefit from the lack of obstruction in a prairie region. Although these considerations are beyond the scope of the APSC 479 portion of the HAB launch, the launch and landing conditions are nonetheless a consideration in the design of a durable camera stabilization system.

3.3 Communications and Recovery

In order to recover the payload at the end of the flight the project team recommends a redundant radio communications system. The payload should include a GPS chip which feeds serial location data to a radio transmitter. This position information should be broadcast at regular intervals and recorded by the launch team so that the team can monitor the flight trajectory and position the recovery vehicle. Commercial GPS chips are limited by law to not operate above 18km altitude or at speeds greater than 515m/s, so the team cannot receive valid GPS data while the payload is above this altitude. In case the GPS unit fails, the team recommends that a secondary location method be employed. A good secondary location method would be to use the strength of the radio signal broadcast by the payload. This method is similar to that used by the avalanche transceivers of backcountry skiers. Using a directional antenna and an audio output with variable steps of amplification, the launch team can sweep the skies to locate the payload during flight. The team will have to consider corrections for the curved flux lines of the radiating antenna. When the payload has landed its radio signal strength will be significantly reduced, so it is imperative that the team has a good idea of what area to search. It may also be possible to launch a smaller, expendable secondary balloon equipped only with a simple radio repeater that may pick up and repeat the payload's GPS signal back to the team to aid in recovery.

3.4 Camera Selection

The project team intended the design to remain flexible enough for use with a variety of digital cameras, but there are some other considerations which will influence our choice of camera. In particular we must choose a camera which can easily be controlled via software. Ideally this control will be exerted by the microcontroller, but failing that we will program the camera to run scripts. This requirement virtually guarantees we will be using a camera made by Canon because of the existence of the Canon Hack Development Kit (CHDK) (15). CHDK enables Canon PowerShot camera users to write scripts and enable features not usually available in an off-the-shelf PowerShot. For our purposes two relevant features are the USB remote shutter release and the interval timer script. The USB remote could enable us to signal the camera over its USB port when we want to take long exposure photos at the zenith of the HAB flight, while the interval timer script will have the camera take photos at evenly spaced intervals during the other phases of flight. It remains to be seen whether CHDK will give us the control we require; hands-on testing is needed.
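For reference, CHDK's USB remote feature fires the shutter when a voltage is applied to the camera's USB port, so the flight microcontroller only needs to pulse a digital output wired to that line through a suitable cable. The following Arduino-style fragment is only a sketch of that idea: the pin number, pulse length and once-per-minute cadence are illustrative assumptions, not tested values, and the exact behaviour depends on the CHDK configuration or script loaded on the camera.

```cpp
// Hypothetical CHDK USB-remote trigger: pulse the +5V line of a cut USB cable.
const int CAMERA_USB_PIN = 7;          // pin driving the USB +5V line (assumed)

void setup() {
  pinMode(CAMERA_USB_PIN, OUTPUT);
  digitalWrite(CAMERA_USB_PIN, LOW);
}

void triggerShutter() {
  digitalWrite(CAMERA_USB_PIN, HIGH);  // rising edge seen by the camera
  delay(200);                          // hold long enough to register (assumed)
  digitalWrite(CAMERA_USB_PIN, LOW);
}

void loop() {
  triggerShutter();
  delay(60000UL);                      // e.g. one frame per minute during ascent
}
```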
3.5 Star and Region of Space Recognition

If the project team is able to obtain high quality images of stars and space, there is a service provided on Flickr called Astrometry.net which allows you to upload your star and sky photos and have the location and contents of the photos solved by the program (16). This is a fun and interesting program which will allow us to easily identify the stars we have captured, as well as determine where our camera was pointed in flight. This information will form part of the data analysis portion of a HAB mission.

4.0 Camera Stabilization

4.1 Stability Theory and Requirements

In order to obtain an unblurred image the camera must remain stable relative to its subject while the shutter is open. Acceptable levels of vibration depend on the shutter speed and features of the digital camera. For example, a camera with an image stabilizing lens will produce a much clearer image in the presence of vibration than a lens without this technology.

The CMOS or CCD sensor chip of a digital camera works by collecting photons of a certain energy at each of the photoactive sites on the surface of the chip. Depending on chip design there is often a filter at each photoactive site for each of the colours red, green and blue (RGB). The energy of these photons is then interpreted as the intensity value for that particular colour, and the sum of these three components determines the RGB value of each pixel in the image. In an ideal image the camera is perfectly still relative to its subject so that the light reflecting off of the subject is collected accurately by the CMOS or CCD chip. If there is relative motion between the subject and the camera while the shutter is open, the image will appear unclear or blurry as light from one location on the chip "bleeds" onto its neighbours and distorts their RGB values. Theoretically this means that for a clear image the shift in relative position of camera and subject should be less than one pixel per exposure.

In practice it is possible to obtain an image that appears unblurred to the naked eye even if there was a slight shift in the relative position while the shutter was open. The intensity of the blurring is a function of shutter speed, lighting levels, and the relative velocity of the camera and subject. In a longer exposure photo of a low light subject whose lighting levels do not vary greatly over the displacement distance, and with a shutter speed of several seconds or more, a displacement which lasts only fractions of a second and then restores the initial relative positions will not have a large impact on the blurriness of the image. However, a displacement which is a significant fraction of the shutter speed in duration, or which does not restore the relative positioning, or in which the lighting levels vary greatly over the displacement distance, will result in a blurry image.

Image size and resolution also play a role in the perception of an image. A large image which is viewed from a distance will appear relatively clear unless it is viewed adjacent to a truly clear version of the image. The blurring only becomes apparent as one gets closer to the image and notices the lack of object boundary definition and contrast characteristic of an unclear image. Photo editing software packages such as Adobe Lightroom and Photoshop can also aid in sharpening a slightly unclear image using the special features included in such software. The process introduces some noise into the digital image but in general it does not greatly degrade overall image quality.

In order to determine what level of stability was necessary to obtain a relatively clear image, the project team took a variety of photos and examined their clarity. The images were obtained with a Nikon D90 DSLR camera with a Nikkor 18-105mm f3.5-5.6 DX VR lens mounted on a tripod. The vibration reduction (VR) function of the lens was turned on for all but one of the photos so that the team could evaluate the effectiveness of a lens with a built-in image stabilization function.
All of the photos were taken with a 10 second exposure. In order to quantify the displacement of the camera relative to the subject, the tripod was placed 1 meter from the subject and then smoothly and uniformly rotated up through a distance determined by the desired angle of displacement using simple trigonometry. A laser pointer and a meter stick were used to record the displacement: for a subject distance a and a displacement angle x, the laser dot moves a distance b = a*tan(x). For example, to test the effect of a 1.5deg displacement during exposure, the laser pointer dot was moved over the duration of the exposure from its initial position at 72cm up 2.62cm.

Figure 5: 1.5deg of camera displacement, large view
Figure 6: 1.5deg of camera displacement, close up view
Figure 7: 1.0deg of camera displacement, large view
Figure 8: 1.0deg of camera displacement, close up view
Figure 9: 0.5deg of camera displacement, large view
Figure 10: 0.5deg of camera displacement, close up view
Figure 11: Shaky camera, large view
Figure 12: Shaky camera, close up view
Figure 13: Shaky camera with VR off, large view
Figure 14: Shaky camera with VR off, close up view

Figures 5 to 8 show an unacceptable level of blur for displacements greater than 0.5deg. Figures 9 and 10 show a marginally acceptable level of blur where large letters and numbers are still legible while the 1mm scale of the meter stick is not. When viewed from a distance with no comparison to a clear image, this photo is marginally acceptable. Figures 11 and 12 show the result when a small impulse is applied to the camera and it is allowed to oscillate slightly on the tripod while the exposure is taking place and the VR function is on. The maximum displacement here corresponds to 0.35deg of camera motion. The large view image is relatively clear and legible. The close up view image is slightly blurry, but not distractingly so, and the 1mm markings on the meter stick are still legible. The project team considers this an acceptable amount of blur. Figures 13 and 14 show a similar shaky camera exposure with the VR function turned off. The image is similar to the 1deg displacement in the amount of blur, which the project team considers an unacceptable level.

Based on this information the project team considers a camera angular displacement of less than 0.5deg during exposure an acceptable level of performance for the stabilization system, provided the camera features a lens with an image stabilizer. Expressed as an equation, the acceptable rate of rotation ω must be less than or equal to 0.5deg divided by the exposure time E(t):

ω ≤ 0.5° / E(t)

To derive this condition we simply note that with E(t) = 10s we found an acceptable angular displacement of less than 0.5deg, so the acceptable angular rate is less than 0.05deg/s.
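A small worked check of the numbers used above, restating the calibration trigonometry and the resulting rate limit (C++; the 1m subject distance and 10s exposure are the test conditions described in this section):

```cpp
#include <cmath>
#include <cstdio>

int main() {
  const double PI = 3.14159265358979;
  const double a_cm = 100.0;                       // tripod-to-subject distance used in the tests
  // Laser-dot travel b = a * tan(x) for the three displacement angles tested.
  for (double x_deg : {1.5, 1.0, 0.5}) {
    double b_cm = a_cm * std::tan(x_deg * PI / 180.0);
    std::printf("%.1f deg -> %.2f cm of dot travel\n", x_deg, b_cm);  // 1.5 deg -> 2.62 cm
  }
  // Acceptable rotation rate: 0.5 deg of displacement spread over the exposure time.
  double exposure_s = 10.0;
  std::printf("max rate for a %.0f s exposure: %.3f deg/s\n", exposure_s, 0.5 / exposure_s);
  return 0;
}
```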
To get a long exposure photo of the stars it will be necessary to control the tilt of the camera so that the camera does not include the Earth in the image, since the brightness of the Earth will cause the auto exposure function of the camera to underexpose the stars. It is clearly also important to have the camera point away from the Sun for the same reason. In the photos and video provided by David Stillman's HAB launch the Earth occupies about 50% of the total image (4). The Canon SD 850 camera has a 35mm focal length in 35mm film equivalent and a sensor which is 24mm long in the vertical direction (in 35mm-equivalent terms). The vertical viewing angle α of a camera is

α = 2 arctan(v / (2f))

where v is the sensor or film size in the vertical direction and f is the focal length in 35mm equivalent. Computing the vertical viewing angle for the SD 850 we obtain 37.85deg. Since the Earth occupies approximately 50% of the image at 33km altitude, we therefore have to tilt the camera up by 19deg to remove the Earth from the image.
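Both numbers can be checked directly from the viewing-angle formula (C++; the 24mm and 35mm values are the 35mm-equivalent figures quoted above):

```cpp
#include <cmath>
#include <cstdio>

// Vertical viewing angle: alpha = 2 * atan(v / (2*f)), with v and f in the same units.
double verticalViewAngleDeg(double v_mm, double f_mm) {
  const double PI = 3.14159265358979;
  return 2.0 * std::atan(v_mm / (2.0 * f_mm)) * 180.0 / PI;
}

int main() {
  double alpha = verticalViewAngleDeg(24.0, 35.0);   // ~37.85 deg for the SD 850
  // With the Earth filling the lower half of the frame, tilting up by half the
  // vertical viewing angle removes it from the image.
  std::printf("view angle %.2f deg, required tilt %.1f deg\n", alpha, alpha / 2.0); // ~19 deg
  return 0;
}
```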
4.2 Mechanical Design

The project team considered a variety of mechanical designs for the implementation of the stabilization system. The objective was to keep the system as simple as possible and use inexpensive, readily available parts from vendors such as Sparkfun. The main components selected for the majority of designs included digital gyroscope sensors, servo motors, and an Arduino microcontroller. These parts are inexpensive and versatile enough to fit a variety of mechanical implementations. The layout and shape of the payload will play a critical role in determining the suitable stabilization system, or vice versa, so the team designed and considered several options for mechanical implementation.

The basic idea for the design was to use a gyro and servo for each axis of rotation we wished to stabilize. In the draft design this was done using a rotating bracket set inside a rotating enclosure, as shown in Figure 15. This design suffers from several flaws which would make PID control tuning difficult, such as the location of the servo on the axis of the centre of mass. A better design would incorporate some kind of gearing and utilize any available inherent stability and balance.

Figure 15: First draft stabilization design

The next iteration of the design, shown in Figure 16, improved on the location of the servos and reduced the amount of material in the mechanical design, but still failed to provide much consideration for inherent stability and balance. It was thought that this configuration would result in a very difficult PID system to tune and that the response time would be much less than that required to meet our stability criteria. For this reason the project team decided to keep working toward a more theoretically sound mechanical design instead of constructing and testing the early iteration designs.

Figure 16: Second iteration design

Improvements on these design ideas came with the help of Jon Nakane and Bernhard Zender of the UBC Engineering Physics Project Lab. Figure 17 is an improvement over the second iteration design shown in Figure 16 which features two servo motors to control the camera about the Z and Y axes. This design also brings with it the idea of separating the still camera system from the rest of the payload. This idea would allow more freedom in designing the camera stabilization system than the first two iterations shown above because we do not have to consider the placement of radio and data components. The radio and sensor payload is not shown in any of the figures and its design is beyond the scope of this recommendation report. The alternate design shown in Figure 18 takes this a step further by adding a gyroscopic stabilizer to the camera platform, which will provide some inherent stability due to the high angular momentum of the spinning gyro. This is a stabilization feature common in high level and professional stabilization units.

Figure 17: Full and close up view of iteration three mechanical design. Mockup courtesy of Bernhard Zender, UBC Engineering Physics Project Lab

Figure 18: Alternate design with gyroscopic stabilizer. Mockup courtesy of Bernhard Zender, UBC Engineering Physics Project Lab

Figure 19: Stabilizer design using gyroscopic stabilizer and inherent stiffness. Mockup courtesy of Bernhard Zender, UBC Engineering Physics Project Lab

Figure 19 is a design idea which includes the gyroscopic stabilizer and adds a lightweight stiffening system, shown by the diamond shaped outline. The intention here is to counteract the tendency for the payload to twist and spin and thereby provide more inherent stability, so that the active stabilization system will not need to play as large a role. The open enclosure concept in these last three designs gives the camera a wide and unobstructed view with which to capture images satisfying the 19deg of tilt required to clear the image of the Earth. It has the disadvantage of being very susceptible to the environmental changes in temperature and moisture the payload will experience during the flight. One prime consideration is the reduced battery capacity caused by low temperatures. One solution may be to use a slow exothermic chemical reaction heater on the battery pack, similar to those used by skiers to keep hands and feet warm in gloves and boots.

It may be possible to decrease the tendency of the payload to rotate by changing the balloon attachment mechanism. For example, using two strings to suspend the payload will reduce the tendency of the payload to twist relative to the balloon. It may also be possible to use two balloons side by side to set the payload aloft, thereby reducing the tendency of the payload to twist. These methods will however require more complicated cut-down procedures, because it is desirable to have both balloons release the payload simultaneously so that the payload comes down quickly and does not drift more significantly from the launch point.

The 19deg of rotation required to remove the Earth from the image sets a limit on the gear ratios allowed by the mechanical design. The range of motion of the servo is 180deg, so a gear ratio of 4:1 with a maximum camera tilt of 22.5deg will allow the required 19deg of tilt and still leave 3.5deg available for stabilization.

4.3 Control Algorithm

The project team tested several variations of PID control code in testing and tuning the stabilization system. The system reads angular rate data from the electronic gyros and uses that data to output position data to the servo motors to counteract any motion experienced by the gyro. The program flow diagram is shown in Figure 20.

Figure 20: Control program flow diagram

The team tested several PID control and servo libraries for the Arduino platform, but in the end found the best results were obtained by writing a simple PI control program so that the program would run quickly on the 16MHz Arduino Uno board.
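The team's actual control code was submitted separately with this report. As an illustration of the kind of loop Figure 20 describes, a minimal single-axis PI sketch in Arduino-style C++ might look like the following; the pin assignments, gyro scale factor, zero-rate offset, and gains are placeholder assumptions rather than the values used in testing.

```cpp
#include <Servo.h>

// Placeholder wiring and calibration values -- assumptions, not the team's code.
const int   GYRO_PIN        = A0;     // analog rate output of the gyro breakout
const int   SERVO_PIN       = 9;      // servo signal pin
const float ZERO_RATE_VOLTS = 2.5;    // gyro output at 0 deg/s
const float VOLTS_PER_DPS   = 0.025;  // gyro sensitivity, volts per (deg/s)
const float KP              = 2.0;    // proportional gain (needs tuning)
const float KI              = 0.5;    // integral gain (needs tuning)

Servo tiltServo;
float angleError    = 0.0;            // integrated gyro rate = drift of the platform, deg
float errorIntegral = 0.0;            // integral of the angle error, deg*s
unsigned long lastMicros;

void setup() {
  tiltServo.attach(SERVO_PIN);
  tiltServo.write(90);                // start in the middle of the 0-180 deg range
  lastMicros = micros();
}

void loop() {
  // Time step since the previous sample, in seconds.
  unsigned long now = micros();
  float dt = (now - lastMicros) * 1.0e-6;
  lastMicros = now;

  // Convert the ADC reading to an angular rate in deg/s.
  float volts = analogRead(GYRO_PIN) * (5.0 / 1023.0);
  float rate  = (volts - ZERO_RATE_VOLTS) / VOLTS_PER_DPS;

  // Integrate the rate to estimate how far the platform has rotated away from
  // where it should be, then apply a PI correction to the servo position.
  angleError    += rate * dt;
  errorIntegral += angleError * dt;
  float correction = KP * angleError + KI * errorIntegral;

  float command = constrain(90.0 - correction, 0.0, 180.0);
  tiltServo.write((int)command);
}
```

Keeping the loop this small is what lets it run quickly on the 16MHz board; the gains must then be tuned toward critical damping, and any gearing in the final chassis changes the effective gain, so retuning is expected.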
4.4 Description of Experimental Testing

The project team constructed a simple test platform with the intention of quantifying the performance of the servo control code. The apparatus consisted of a mirror attached perpendicularly to the gyro sensor. This combination was then mounted on top of the servo motor. In theory, if the body of the servo is rotated, the mirror should remain in its original orientation relative to the test bench. The servo motor has a range of 0 to 180 degrees and the mirror was positioned in the middle of this range at 90 degrees. A meter stick was attached horizontally to a flat vertical surface 1m from the mirror. A laser pointer was then fixed so that its dot would reflect off the mirror and strike the 50cm mark of the meter stick. The components of the test are shown in Figure 21.

Figure 21: Components of experimental performance tests

The team performed two performance tests using this setup. The first test was a measure of the ability of the control system to return to its origin position after a 90deg displacement. For this test the body of the servo was rotated 90deg and then returned to its original position, and the laser dot's deviation from the 50cm mark was recorded. The second test involved attempting to quantify the amount of stabilization provided by the servo as the servo body is rotated about the original position. This test was performed at various rates of rotation in order to quantify the limitations of the system and observe the deviation of the dot from the 50cm mark.

4.5 Results

Both tests were performed several times in order to confirm that the data being taken was reproducible and that mistakes were not repeated. The first test was performed by setting the mirror to one side of 90deg so that there would be enough travel in the servo to perform the test without running into the limits of the servo motor. The servo body was then carefully rotated at a consistent rate 90 degrees from the origin on the underlying graph paper and then returned to its original position at approximately the same rate. During the course of the rotation the servo would attempt to keep the dot in the same position but would oscillate wildly with an amplitude of approximately 10cm, which corresponds to 5.7deg of servo angular displacement. Eventually, past 45deg, the dot would travel outside the range of the meter stick. As the servo body was returned to its original position the dot would return also. At the final position, where the servo body is returned to its origin, the dot would reside at approximately the 25cm mark of the meter stick. The 25cm difference between beginning and end position corresponds to an angular displacement of 14 degrees.

The second test was more difficult to quantify due to the lack of a reliable and available way to consistently track the angular rotation rate applied to the servo body. The gyro sensors used in this project are the MLX90609 model with a maximum angular rate of 70deg/s. This is a slow maximum angular rate, chosen to best match the slow rotation observed in David Stillman's HAB launch video; the smaller angular rate maximum corresponds to higher accuracy in the output. The project team rotated the servo body about the origin using three approximate rates of rotation: 5deg/s, 10deg/s, and 30deg/s. At 5deg/s the servo would not activate consistently, so the dot would disappear from the measurement area. At 10deg/s the servo would activate consistently and would oscillate about the 50cm mark. Eventually the dot would drift to one side or the other and off of the measurement area as the inherent sampling errors piled up. The case for 30deg/s was similar to the 10deg/s result, with the dot fluctuating wildly about 50cm and then eventually shifting to one side or the other. In all cases, while the servo body was in motion and the servo was actively stabilizing, there would be a wild oscillation of approximately 10cm amplitude (5.7deg displacement), as in the case of the first test.
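The angular values quoted in these results follow from the test geometry: with the meter stick 1m from the mirror, a dot displacement d corresponds to an angular displacement of arctan(d / 1m). A quick check of that conversion (C++):

```cpp
#include <cmath>
#include <cstdio>

int main() {
  const double PI = 3.14159265358979;
  const double L_cm = 100.0;                 // mirror-to-meter-stick distance
  for (double d_cm : {10.0, 25.0}) {         // oscillation amplitude, end-position offset
    double theta_deg = std::atan(d_cm / L_cm) * 180.0 / PI;
    std::printf("%.0f cm on the meter stick -> %.1f deg\n", d_cm, theta_deg); // 5.7, 14.0
  }
  return 0;
}
```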
4.6 Discussion of Results

The results obtained clearly show that the stabilization system falls far short of the <0.05deg/s rotation rate required to obtain a clear long exposure photograph. The tests performed lack an accurate way to measure the difference between the applied angular rotation rate and the apparent rotation rate of the mirror, but it is clear to the project team that many improvements must be made if this stringent requirement is to be met. A test of this difference is important to establish a quantified determination of system improvements or optimizations. For example, it is difficult to determine whether changes to the constants in the PI control system result in improved performance without this information. This test could be performed by attaching the servo body to a controlled motor programmed to oscillate between two points at a consistent angular velocity. The experimenter could then systematically attempt to minimize the apparent rate relative to the applied rate. It would be useful to perform the measurement on an optical table with the necessary equipment to align the laser beam so that it does not shift over the course of a measurement and so that the incident and reflected beams lie in the same plane. The servo body could also be mounted such that it is free to rotate but no translational motion can take place. In this way less error will be introduced into the displacement measurement. The project team feels that this error is small relative to the error caused by other sources in the experiment; however, it could be a limiting factor once a more acceptable level of performance is reached.

5.0 Conclusions

The results of the experimental testing show that there is a large amount of improvement to be made to the system if the <0.5deg displacement requirement for a clear image is to be met. Some of these improvements could be made with the existing hardware while others may require replacing some components of the system.

From the first experimental test it is clear that some data is being lost during the operation of the stability system, such that the system does not return to its starting location when it is disturbed. This error means that the camera system will not return to its original position once a displacement occurs, and so a blurry image is the result if the displacement is greater than 0.5deg. This error could be a result of several factors: the sample rate of the gyro, the rate at which data is read by the Arduino, round-off error in either or both devices, limitations on servo position resolution imposed by the control code or elsewhere, or code oversights in the control program.

In the second experiment the real limitations of the stabilization system became fully apparent. The 10cm (5.7deg) amplitude oscillation observed while the system attempts to stabilize a displacement is already more than 10 times larger than the allowable maximum displacement. There is also the eventual "random walk" displacement observed as the sample rate error described above leads the system to drift to one side. The likely cause of this oscillation is the PI control code, in particular the tuning parameters and program run time. The system is clearly underdamped and therefore more tuning and code optimization are required to bring the control system to a critically damped state.
When the mechanical stabilizer chassis is constructed, the gearing ratios of the servo motors will have an effect on the control output and the system will need to be retuned. Beyond tuning and optimizing the control code, there is an inherent lag in the sensor system because the gyro rate sensor requires a force to be applied before a change in the angular rate occurs. If an accelerometer had been used instead of a gyro, it would read the force as it is applied and eliminate some of the lag in the system. The accelerometer would also have an advantage over the gyro sensor in the vertical direction, because an accelerometer would be able to return the camera to horizontal by referencing the acceleration due to gravity.
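To illustrate the levelling idea in the last sentence: when a 3-axis accelerometer is not otherwise accelerating, the measured gravity vector gives the tilt directly, with no integration and therefore no drift. A small sketch (C++; the axis convention and the sample readings are assumptions, not values from a specific sensor):

```cpp
#include <cmath>
#include <cstdio>

// Tilt of the camera platform from a static accelerometer reading: gravity
// defines "down", so no rate integration is needed.
// ax = reading along the camera's pointing axis, az = vertical axis, in g units.
double tiltDeg(double ax, double az) {
  const double PI = 3.14159265358979;
  return std::atan2(ax, az) * 180.0 / PI;
}

int main() {
  std::printf("level reading:  %.1f deg\n", tiltDeg(0.00, 1.00));  //  0 deg
  std::printf("tilted reading: %.1f deg\n", tiltDeg(0.33, 0.94));  // ~19 deg
  return 0;
}
```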
Accelerometers also have the ability to provide more information to the system, for example the can provide information about which direction points directly toward the Earth which can be used to level the camera or they can be used to record flight trajectory data. Accelerometers are also in general less expensive than the gyros purchased for this portion of the project. 2. The PI control code should be more rigorously tuned and characterized. The code itself could also likely be optimized for higher speed and accuracy. The system should be tuned for critical dampening. 3. Rewrite the servo control code. The project team does not believe the current servo control code is ideal for accurately controlling the servo position to within the fractions of a degree which is required for a stable image. 4. Perform further research on a suitable mechanical design. The project team described several design iterations in this report but due to resource constraints and circumstance was not able to construct and test a physical prototype of any of these designs. More research needs to be performed before a clearly superior mechanical design emerges. 5. Perform more research into the feasibility of using a spinning gyro to add inherent stability to the platform. It may be entirely possible that the servo system is more complicated and less effective than a spinning gyro stabilized system. 6.  Assemble a student team to complete the other systems necessary for a HAB launch, perform the launch, recover and analyse the data.        38  References  1. Sorrel, Charlie. The $150 Edge-of-Space Camera: MIT Students Beat NASA On Beer-Money Budget. Wired Gadget Lab. [Online] September 15, 2009. [Cited: September 26, 2010.] http://www.wired.com/gadgetlab/2009/09/the-150-space-camera-mit-students-beat-nasa-on-beer- money-budget/. 2. Blake. Noisebridge Successfully Launches and Recovers Spaceship Alpha. Super Happy Fun Time Space Balloon Picture Machine. [Online] [Cited: September 26, 2010.] http://spaceballoonproject.blogspot.com/2010/02/noisebridge-successfully-launches-and.html. 3. meteotek08. Llançament. Flickr Photoalbum. [Online] [Cited: September 26, 2010.] http://www.flickr.com/photos/meteotek08/sets/72157614770919393/. 4. Stillman, David. HAB Launch One. Flickr Photoalbum. [Online] [Cited: September 26, 2010.] http://www.flickr.com/photos/stilldavid/sets/72157624101347600/with/4624712115/. 5. Nathan Seidle. Homepage. Sparkfun Electronics. [Online] [Cited: September 26, 2010.] 6. Atkins, Nolan. Vertical Profile Based on Composition. Chapter 1 - The Earth and it's Atmosphere. [Online] [Cited: September 26, 2010.] http://apollo.lsc.vsc.edu/classes/met130/notes/chapter1/vert_comp.html. 7. Loren, Karl. Atmospheric Structure. Atmospheric Chemistry Data and Resources. [Online] [Cited: September 26, 2010.] http://www.oralchelation.com/clarks/data/p1.htm. 8. (TIGRE), The Interamerican Geospace Research Experiment. Flight Readiness Review Document. Aerospace Balloon Imaging Testing with Accelerometers (ABITA) Experiment. [Online] July 22, 2008. [Cited: September 26, 2010 .] http://laspace.lsu.edu/pacer/Experiment/2008/Documentation/FRR/ABITA_TIGRE_FRR.pdf. 9. Kaymont. Sounding Balloons. [Online] [Cited: September 26, 2010 .] http://www.kaymont.com/pages/sounding-balloons.cfm. 10. Spaceflight, Cambridge University. Trajectory Predictor. CUSF Trajectory Predictor. [Online] [Cited: September 26, 2010.] http://habhub.org/predict/. 11. Wyoming, University of. Balloon Trajectory Forecasts. 
11. University of Wyoming. Balloon Trajectory Forecasts. Department of Atmospheric Science. [Online] [Cited: September 26, 2010.] http://weather.uwyo.edu/polar/balloon_traj.html.
12. Essence's Model Rocketry Reviews and Resources. Parachute Size Calculator. Calculators and Tools. [Online] [Cited: September 26, 2010.] http://www.rocketreviews.com/cgi-bin/resources/recoverybox.cgi.
13. Stillman, David. High Altitude Weather Balloon Project. davelog. [Online] July 25, 2010. [Cited: September 26, 2010.] http://stilldavid.com/blog/2010/07/high-altitude-weather-balloon-project/.
14. Seidle, Nathan. High Altitude Balloon Launch. Sparkfun. [Online] July 1, 2010. [Cited: September 26, 2010.] http://www.sparkfun.com/commerce/tutorial_info.php?tutorials_id=180.
15. Canon Hack Development Kit. CHDK - Unleash the POWER in your Canon PowerShot! CHDK Wiki. [Online] [Cited: September 26, 2010.] http://chdk.wikia.com/wiki/CHDK.
16. Astrometry.net. Astrometry Group Pool. Flickr. [Online] http://www.flickr.com/groups/astrometry/pool/with/4904953574/.
17. rc airplanes. RC Helicopter Camera Stabilizer. aerofunmen. [Online] [Cited: September 26, 2010.] http://aerofunmen.blogspot.com/2010/07/rc-helicopter-camera-stabilizer.html.
18. Princeton University. Camera stabilization senior thesis. YouTube. [Online] [Cited: September 26, 2010.] http://www.youtube.com/watch?v=YP-YWk3PvR8.
