UBC Theses and Dissertations


Automated systems to assess weights and activity in group housed mice Noorshams, Omid 2017



Full Text

Automated systems to assess weights and activity in group housed mice

by

Omid Noorshams

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF

Master of Science

in

THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES

(Neuroscience)

The University of British Columbia (Vancouver)

December 2017

© Omid Noorshams, 2017

Abstract

In order to investigate central nervous system disorders such as Parkinson's disease and Alzheimer's disease, genetically altered mice are used. These mouse models should be frequently monitored to assess disease progression. Most monitoring systems for activity and weighing require removing the animals from their homecage and placing them in a novel environment. During these procedures animals may undergo stress, which can alter their behavioral and social phenotypes. We developed two systems to automatically weigh and track mice in their homecage.

We have developed an automatic weighing system that facilitates these procedures within the mouse homecage. In this system up to 10 mice move freely between two cages (28x18x9 cm) connected by a weighing chamber mounted on a load cell. Each mouse is identified using an RFID tag placed under the skin of the neck. In one cage we placed a bottle of water, and in the other cage food. A single-board computer (Raspberry Pi; RPi) controls the task, logging the RFID tag, load cell weights, and time stamps from each RFID detection until the animal leaves the chamber. Collected data were statistically analyzed to estimate each mouse's weight. We anticipate integration with tasks where automated imaging or behaviour is assessed in homecages. This automated system permits weighing mice several times per day over a long period of time without disruption by human interaction.

In the automated tracking system, in addition to a grid of 18 RFID readers and 18 microcontrollers, one Raspberry Pi was used as the master micro-computer.
While the animals were moving inside the homecage, the printed circuit board on which the RFID readers were located monitored the animals' location and activity. One RFID reader was on at a time for 120 ms and off for almost 2.3 s; therefore, the mean reading frequency can be around 1 Hz. In this system, we acquired data for one mouse and for two mice in the homecage. In order to test the accuracy of the system, we also monitored their activity with a camera. This system allowed us to monitor mouse activity in the homecage over a long period of time.

Lay Summary

We have designed and implemented two automated systems which facilitate data acquisition and eliminate the need for human-animal interaction. The first system automatically weighs mice in their homecage. This system allows us to weigh mice several times per day without any human interference, which can cause anxiety in animals. The second system automatically tracks mice in their homecage with RFID technology. This system also allows us to track mice without any human-animal interaction over a long period of time. The Automated Tracking System can monitor mice's activity and behavior in their homecage.

Preface

The first part of this dissertation, related to the Automated Weighing System, was published in the Journal of Neuroscience Methods in May 2017 by Omid Noorshams, Tim Murphy, and Jaime Boyd [30]. Dr Tim Murphy supervised the projects and worked with me to write the paper mentioned above. Jaime Boyd helped me to develop the code for the automated weighing system. I am responsible for building the systems, figures, scripts, and code. The Automated Tracking System PCB was designed by me and Pawel.

Table of Contents

Abstract
Lay Summary
Preface
Table of Contents
List of Tables
List of Figures
Acknowledgments
1 Introduction To Automated Weighing and Tracking System for Group Housed Mice
2 Automated Weighing System Methods
3 Automated Weighing System Results
4 Discussion and Conclusion on Automated Weighing System
5 Automated Tracking System Methods
5.1 Animals and Husbandry
5.2 RFID Tag Injection
5.3 Printed Circuit Board
5.4 Video-based recording system
5.4.1 Computer Vision Technique
6 Automated Tracking System Results
7 Discussion and Conclusion on Automated Tracking System
Bibliography
A Automated Tracking System Codes
B Automated Tracking System Schematic

List of Tables

Table 2.1 Parts List

List of Figures

Figure 2.1 Dual cage automatic weighing system for group housed mice. (a) 3D view of the automatic weighing system, designed with Blender. (b) The 5 main parts of the hardware in the automated weighing system, designed with Blender.

Figure 3.1 Calibration of the load cell with 34 standard weights.
The slope of this line (slope = 0.01399e6), fitted to the load cell outputs, represents the relationship between the decimal values of the load cell (24-bit output) and the standard weights.

Figure 3.2 Set of epochs over 12 hours (containing 102 epochs) for one mouse crossing the chamber. Shows the load cell values (g).

Figure 3.3 The estimation method, based on extracting the most common value during 12 hours of activity for one mouse. The set of epochs were merged together and binned in 0.2 g increments. The actual weight was 27.6 g, which was equal to the estimated weight in this case.

Figure 3.4 Comparison of actual weights with estimated weights for 5 mice. Mice were manually weighed 7 times with a digital scale (r=0.90 and p=0.0001).

Figure 3.5 Long-term monitoring of weights in 5 group-housed mice. (a) The activity patterns of mice during each day over 53 days. Each tick mark indicates a weighing tube entry. (b) Weight fluctuation over 53 days for 5 continuously tracked mice.

Figure 5.1 3D view of the Automated Tracking System, designed with Blender.

Figure 6.1 Top-down view of the baseplate, recorded with a Picamera at the top of the homecage. (a) The horizontal and vertical black lines show the boundaries of the RFID readers. Blue lines and purple curves illustrate monitoring with the RFID readers and with the computer vision technique, respectively. 65 is the number of the RFID tag, and the green box shows an approximation of the mouse's area. (b) One sample frame after subtraction of the background frame. (c) To binarize the frames after background subtraction, values from 90 to 255 are assigned 0 and values less than 60 are assigned 1. The white contour shows the approximate location of the mouse for one frame.
Figure 6.2 Comparison of traveled distance measured by computer vision techniques and by the Automated Tracking System. Stars show the distance traveled by the mouse with a bin of 3 s. The orange line shows the equality of the two methods. The black dashed line illustrates the linear regression of the system measurement. The R value for the regression is 0.997 and the slope of the black line is 1.25.

Figure 6.3 Frequency of reading by the Automated RFID Tracking System. Frequency can vary depending on animal activity and the number of animals within the homecage. Here, one mouse and two mice were monitored for 16 min. (a) For one mouse in the homecage, the maximum reading frequency is 6.25, the mean frequency is 1.27, and the minimum frequency is 0.06. (b) For two mice in the homecage, the maximum frequency is 6.24, the mean frequency is 0.95, and the minimum frequency is 0.08.

Figure 6.4 Contours of the animal's activity during 16 min in the homecage. (a) Activity monitored by the Automated Tracking System indicates that most activity is at the corners. (b) Activity monitored by the computer vision technique shows that the major activity is at the left and right sides of the homecage.

Acknowledgments

I would like to thank my advisor Prof. Timothy Murphy for the continuous support of my research, and for his patience, motivation, and enthusiasm. I would also like to thank Jamie Boyd for his guidance, which helped me throughout the research.

Chapter 1

Introduction To Automated Weighing and Tracking System for Group Housed Mice

Alterations in the physiology and behavior of home-cage animals such as rodents have recently become of interest to scientists. Topics include investigating perturbations in neural circuit function, disease detection, food consumption, and activity parameters.
On one hand, the importance of acquiring a deep understanding of home-cage animals' physiology and behavior, and on the other hand the inherent limitations of manual assessment, such as cost, time consumption, and reproducibility, necessitate automated systems for studying animals. Automated mouse home-cage systems have been widely developed in the recent decade, facilitating the monitoring of mouse behaviors, physiology, brain activity, and body conditions for extended periods [19].

In addition, assessing an animal's health status and well-being is necessary for research studies in which anorexia, wasting, dehydration, and death can be potential complications. In most studies, scientists and veterinarians implement simple and non-invasive methods to monitor body weight, behavioral parameters, and physical appearance for the evaluation of health status and well-being. There are several noninvasive and simple examinations, such as monitoring responses to external stimuli, poor coat condition, nasal or ocular discharge, and depressed appetite due to dehydration and weight loss [20]. Any major deviation from normal can be taken as a sign of a health problem. Body weight (BW) is one of the important parameters in body condition scoring, which is used to evaluate an animal's health, establish endpoints, assess side effects of drugs, and implement water/food restriction protocols. Loss of BW, which results from decreased food and water consumption, is a sign of poor health status. Around one week of continuous weight loss in tumor-bearing rats can be considered an irreversible progression toward death. Also, comparing the BW of gallstone-bearing mice with healthy controls shows that BW deviations are significant for males but not for females. Even when behavior and physical appearance do not indicate any abnormalities, loss of twenty percent of BW is considered a criterion for euthanasia [20, 43].

Water/food restriction is one way to motivate animals to cooperate in behavioral experiments.
In this procedure, the weight of the mouse needs to be carefully monitored [8, 15, 42]. In order to implement a water/food restriction, ad libitum values should first be assessed. Then, for food regulation, rodents are fed 70% of their ad libitum intake until BW reaches 90% to 80% of control BW. Acute water deprivation for more than 24 hours can cause serious clinical dehydration and is not allowed [42]. Rodents under food/water restriction must be weighed regularly, at least twice per week, and their behavior and health status must also be frequently monitored [33]. However, more frequent weighing of animals is highly encouraged. Although a 10% to 20% reduction in BW is assumed to be safe when implementing water/food restriction protocols, different weight reduction protocols can affect the stress and motivation of animals; consequently, replicating and comparing results between studies may become difficult. Especially when subtle behavioral differences are investigated, the degree of motivation, which is a result of the food/water regulation, becomes more critical.

Frequent monitoring of BW is also important in psychotropic medication administration studies. When patients have been exposed to psychotropic medications for a long period of time, weight gain is one likely challenge that will be observed [5, 18, 27]. The frequency and rate of weight gain in psychiatric patients depend on the type of medication; for instance, olanzapine administration can cause a 2.3 kg increase per month in humans, whereas risperidone brings only 1 kg per month [46]. Increased appetite is a result of the interaction of antipsychotic drugs with neuronal receptors for dopamine, serotonin, and histamine. On the other hand, the effects of drug-induced hyperprolactinaemia on gonadal-adrenal steroids and insulin may result in metabolic-endocrine disruption of weight regulation [46].
Moreover, some antidepressants [9] and mood stabilizers [16] may cause a serious obesity problem, which has been acknowledged as an individual and public health issue. Obesity may lead patients to develop other physical problems, such as thyroid abnormalities [13, 25]. Hence, monitoring the physiological and behavioral status of patients, or of animals in a study, during the administration of antidepressants, mood stabilizers, and antipsychotics is an important part of a related study.

As previously mentioned, the importance of monitoring animals' physical and behavioral status during a study is apparent to scientists and biomedical researchers. However, depending on the type of animal, which can be rodents, cats, or primates, and also on the aims of the study, the frequency of monitoring can vary from several times per week to several times per day. In small animals, such as rodents, this frequency can be higher. Therefore, monitoring laboratory mouse behavior, rather than that of animals like non-human primates, is more important for biomedical studies. Although we need to frequently record the status of animals, invasive handling of animals such as mice can alter plasma concentrations of corticosteroids, glucose, and growth hormones, as well as heart rate, blood pressure, and behavior. These alterations may thus impair the reliability of responses. For example, picking up mice by the tail is aversive and stressful [7]. Although stress can be reduced by handling mice gently, such as using a tunnel or cupping them in your hand, laboratory routines such as personnel entering the animal housing room, moving cages, cleaning, collecting blood and tissues, and weighing animals can still cause pain and stress. Intense stress for animals induced by laboratory routines is a major concern for Institutional Animal Care and Use Committees (IACUC) and scientists.
When animals have been frequently exposed to stressful situations, their biological equilibrium may be altered and maladaptive behaviors can become problematic. Automating home-cage systems reduces the need for human interaction, and thus significantly decreases stress [7]. On the other hand, manual weighing, water/food intake recording, and monitoring animal behavior can be highly time consuming for researchers. Automated systems can reduce this unnecessary time consumption for researchers.

We have developed a system to automatically weigh group-housed mice [30]. This system allows researchers to monitor the fluctuation of body weight several times per day, without any interaction, and facilitates accurate record keeping, protocol compliance, and integration with tasks where automated imaging [29] or behavior is assessed in home cages [4, 33]. The details of this system can be found in Chapters 2, 3, and 4.

Social interaction and communication in laboratory animals are very important, and rodents are one example of laboratory animals that manifest these basic features. The monitoring of social interaction and communication between animals has made major contributions to psychology, ecology, genetics, and neuroscience [37, 38, 47]. Moreover, these features have proven important for understanding and treating many neuropsychiatric disorders, such as autism and schizophrenia [31, 37, 41]. Tracking animals in a natural habitat can be very difficult, since they need to be monitored with high temporal and spatial resolution. Recording their behavior and manually quantifying their activity can be time-consuming; it may also be susceptible to errors and subjectivity, and may overlook the most obvious phenotypes [6, 28, 45]. In order to overcome these constraints, automated tracking and behavioral phenotyping technologies have been developed with which we are able to monitor animals in their natural or semi-natural habitats, without any human interaction.
Some of these systems are based on computer vision and machine learning techniques, which require thousands of pictures of freely behaving mice for model training and behavioral annotation; mouse behaviors can be characterized using machine learning methods, which may then be able to label other test videos and pictures. So far, these studies have only addressed the behavioral phenotypes of a single individual in its homecage [23, 39, 48, 49]. The same strategy has been implemented for large groups of animals [11, 12, 17, 24, 40]. Machine learning and computer vision techniques can be fruitful for large species like humans, but there are limitations for smaller animals like rodents and fishes, which are very similar in color, size, and behavior. Another technique allows us to track the position of a single individual within a large group by using radio-frequency identification (RFID) tagging [3, 14, 26, 34]. Neurodegenerative disorders such as Alzheimer's disease (AD) were the subject of one such study [26], which characterized mice carrying a genetic disposition to develop AD-like pathology in their semi-natural habitats. In this study, RFID tags and a network of antennas were used to label each of the mice individually and to track the group over a long period of time.

Although the RFID technique facilitates tracking a large group of animals over a long period of time, its low temporal and spatial resolution means it cannot be used to detect their social behaviors. The spatial accuracy is reportedly around 20-40 cm. Also, when mice move faster than 110 cm/s, the system cannot detect the animals [1, 21, 35]. The computer vision and RFID techniques can be implemented together as a hybrid system [45]. In this study, 10 mice were located in a homecage and tracked by the RFID technique. The RFID readings were time-synchronized with the visual detection of X and Y coordinates, velocities, and orientation, and were used to show social and individual behaviors.
They also monitored the formation and stability of a dominant-subordinate hierarchy in the group of mice. A grid of 39 RFID antennas, 11 cm in diameter, was located horizontally under the semi-natural homecage, and one CCD video camera recording at 30 fps was positioned on the top of the cage. By combining these two techniques, they were able to increase the spatial and temporal resolution to 0.5 cm and 30 ms, respectively.

Although studying mice in semi-natural environments can be very helpful for understanding mouse behavior in nature, we also need to provide insight into how they act and behave in their original homecage [3]. In that study, a system consisting of a grid of 18 RFIDs (in a 3x6 array) was developed to gain an understanding of the activities of commonly used inbred mouse lines. This system allowed the animals to be monitored continuously, eliminating human interaction. The system is specifically useful for studying progressive motor impairment models such as neurodegenerative diseases, and also social interaction impairment models such as autism, within modern high-density individually ventilated caging (IVC). This system also has low temporal and spatial resolution (a maximum frequency of 2 Hz and 5 cm, respectively).

Chapter 2

Automated Weighing System Methods

All procedures were approved by the University of British Columbia Animal Care Committee, conformed to the Canadian Council on Animal Care and Use guidelines, and employed C57BL6 mice. Two standard mouse housing cages of 28×18×12 cm, capable of holding up to 10 mice, were connected at 30×30 mm openings by a non-contacting floating chamber resting on a load cell. The chamber is 100 mm long with a square cross-section (30 mm per side). This chamber is mounted on a 100 g load cell (BONAD) connected to a breakout containing an AVIA Semiconductor HX711 load cell amplifier (Sparkfun); the parts list can be found in Table 2.1.
RFID-tags (inert glass-encapsulated tags, 125 kHz, Sparkfun) were injected into the nape of the mouse's neck [10] and detected by an RFID reader (ID-12LA RFID, SparkFun) held by a fixture on the top of the chamber with a 3 mm gap. The gap prevents the chamber from touching the RFID reader (which could bias the weight). Animals moved freely between the two cages. Once the RFID-tag was detected, the load cell would start recording at 10 samples per second until the RFID reader signaled that the tag was no longer in range and the measured weight had dropped below an arbitrary minimum threshold. The load cell and RFID reader were connected to a Raspberry Pi for data acquisition. Entry and exit RFID-tag times were also monitored and recorded as epochs. A 3D view of the system is depicted in Figure 2.1.

Figure 2.1: Dual cage automatic weighing system for group housed mice. (a) 3D view of the automatic weighing system, designed with Blender. (b) The 5 main parts of the hardware in the automated weighing system, designed with Blender.

Since the average weights of adult female and male mice are 22 g and 28 g respectively [32], and the weight of the chamber is 20 g (which was set as a tare at the beginning of each trial), we used a load cell with a maximum capacity of 100 g and a precision of 0.1 g. This level of precision is consistent with weight fluctuations within a day, which can be around 0.2 g. The load cell was calibrated with 34 standard weights (0-12.98 g in increments of 0.37 g). We applied the standard weights from lowest to highest and collected outputs at 10 Hz for 5 s (50 samples for each weight). We averaged the outputs for each weight and then calculated the standard deviation, which was less than 0.002%. By fitting a line to the data and estimating its slope (0.01399e6; r=0.99 and p=0.0001), we could convert the output signals of the load cell to grams. This line represents the relationship between the decimal values of the load cell (24-bit output) and the standard weights.
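As an illustration of this conversion, the 24-bit two's-complement sample from the HX711 can be sign-extended and then scaled by the calibration slope. This is a minimal Python sketch, not the Appendix code; the function names and the tare argument are illustrative, while the slope is the fitted constant reported above:

```python
COUNTS_PER_GRAM = 0.01399e6  # slope of the line fitted to the 34 standard weights

def decode_hx711(raw):
    """Sign-extend a raw 24-bit two's-complement HX711 sample to a signed int."""
    if raw & 0x800000:       # sign bit of the 24-bit sample is set
        raw -= 0x1000000
    return raw

def counts_to_grams(raw, tare_counts=0):
    """Convert a raw load-cell sample to grams, relative to the chamber tare."""
    return (decode_hx711(raw) - tare_counts) / COUNTS_PER_GRAM
```

With this slope, one gram corresponds to 13,990 load-cell counts above the tare.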
As previously mentioned, an HX711 load cell amplifier was used to amplify the signals received from the strain gauges in the load cell and then communicate with the Raspberry Pi. Five wires (positive and negative signals: white and green; power: red; ground: black; and electromagnetic shield: yellow) from the strain gauge are connected to labeled inputs on the HX711 breakout. Vcc and GND on the breakout are connected to 5V and ground on the Raspberry Pi, respectively. The HX711 includes a built-in 24-bit analog-to-digital converter (ADC). Data are output serially on a single pin (DAT), at a rate set by the signal on the clock pin (SCK). DAT and SCK can be connected to any GPIO on the Raspberry Pi. The Python code for communication between the HX711, the Raspberry Pi, and the RFID reader, as well as the Python code for data analysis, can be found in the Appendix along with an electronics schematic.

Table 2.1: Parts List

Component                              Supplier   Part Number
Raspberry Pi Model B2                  Newark     38Y6467
32Gb SD card                           Adafruit   1583
Pi Cobbler+                            Adafruit   2028
RFID reader ID-12LA (125 kHz)          Sparkfun   11827
RFID reader breakout                   Sparkfun   13030
RFID-tag glass capsule (125 kHz)       Sparkfun   08310
100g load cell                         BONAD      CZL639M
Load Cell Amplifier HX711 Breakout     Sparkfun   13879
7" Touchscreen display Raspberry Pi    Element14  49Y1712

Once the animal went inside the chamber and the RFID-tag was read by the RFID reader, the Raspberry Pi would start recording data at a frequency of 10 Hz for each epoch. The Vcc and GND pins of the RFID reader were connected to 5V and ground of the Raspberry Pi, respectively. Moreover, the D0 and TIR (Tag In Range) pins were connected to the serial port (RXD) and any other GPIO on the RPi, respectively. RXD is a pin on the RPi through which data can be transferred. Once an RFID-tag is located in range of the reader, TIR goes high and the serial number of the RFID-tag is read by the RPi. TIR stays high until the tag is out of range of the reader. While the TIR pin is high, the RFID reader cannot read another tag, even if another tag is in range.
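The start/stop logic described here (begin logging at 10 Hz when TIR goes high; keep logging until the tag is out of range and the weight has fallen below the minimum threshold) can be modeled as a small state update. This is a simplified sketch with hypothetical names, with the GPIO reads replaced by plain arguments; the 2 g default stands in for the thesis's unspecified "arbitrary minimum threshold":

```python
def should_record(recording, tir_high, weight_g, min_weight_g=2.0):
    """Decide whether the load cell should keep logging this epoch.

    An epoch starts when a tag is detected (TIR high) and ends only once
    the tag is out of range AND the measured weight is below threshold."""
    if not recording:
        return tir_high                           # start a new epoch on tag detection
    return tir_high or weight_g >= min_weight_g   # keep logging while mouse is present
```

Keeping the epoch open while the weight stays above threshold covers the case where the mouse's body is still on the chamber even though its neck tag has left the reader's range.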
Therefore, only one animal can be detected inside the chamber until that animal moves out of the RFID range.

During these experiments, paper towels were used instead of bedding. I cleaned the cages every other day for 53 days. Although two mice could fit inside the chamber, this rarely happened and was also easily detected: two mice together would weigh more than 50 grams, and we knew that none of the mice were that heavy.

Chapter 3

Automated Weighing System Results

Female C57bl6 mice that were 12 months of age and had transcranial brain windows [36] and head-fixation bars [29] were used for data acquisition. While the animals were equipped for optogenetics, they were tested without head-fixation to assess feasibility for future applications. The 5 mice were placed in the dual cage automatic weighing system continuously for 53 days. Although we assessed 5 animals here, we have recently been approved to house 10 mice in the double cage configuration. Preliminary work with 10 male mice indicates little issue with crowding or fighting (over standard caging), consistent with recent work in connected cages with up to 50 mice.

The output of the load cell is a 24-bit value that needs to be converted to a weight value. In order to convert this 24-bit signal to a weight value, we collected output signals from standard weights. We first recorded the baseline of the load cell, meaning the 24-bit signal without any weight on the chamber. Then we applied the standard weights (0-12.98 g in increments of 0.37 g) from lowest to highest and collected outputs at 10 Hz for 5 s (50 samples for each weight). We averaged the outputs for each weight and then calculated the standard deviation, which was less than 0.002%. Therefore, fluctuations of the outputs with the load cell in a stable state were negligible. In a graph of the averaged 24-bit outputs against the corresponding weights (Figure 3.1) we can see the linearity of the load cell and the proportion of grams to output values. By fitting a line to the data and estimating its slope, we could convert the output signals of the load cell to grams (Figure 3.1).

Figure 3.1: Calibration of the load cell with 34 standard weights. The slope of this line (slope = 0.01399e6), fitted to the load cell outputs, represents the relationship between the decimal values of the load cell (24-bit output) and the standard weights.

Mice moved freely between the two cages and had continuous access to food and water. During this period we recorded epochs, each corresponding to the time from when a mouse enters the chamber and the load cell starts recording until the mouse goes out of RFID reading range. The pattern of epochs for one mouse over 12 h is shown in Figure 3.2, with 80% of epochs less than 1.5 s in duration. We can see that at the onset of each epoch the values increase to a maximum. The maximum value is not reliable for estimating weight, since other animals may simultaneously touch the chamber. In order to estimate weight, we considered a set of epochs taken together (for example, epochs within 6 h) and found the most common weight in a histogram with increments of 0.2 g.

In order to estimate weight we first needed to exclude out-of-range points. Then, using the hist and sort functions in Matlab, the weights in the set of epochs were sorted in descending order. By applying the stairs function in Matlab we were able to find the most common weights within a specific increment (0.2 g). Figure 3.3 shows one example of this estimation. As expected, the distribution of values around the actual weight (27.6 g in this case) forms a clear peak.

Figure 3.2: Set of epochs over 12 hours (containing 102 epochs) for one mouse crossing the chamber. The plot shows the load cell values (g).
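The most-common-value estimate described above can be sketched in Python (an illustrative reimplementation rather than the Matlab analysis used in the thesis; the function name, the 10-50 g range used to exclude out-of-range points, and reporting the bin center are assumptions, while the 0.2 g bin width follows the text):

```python
def estimate_weight(samples, bin_width=0.2, lo=10.0, hi=50.0):
    """Estimate a mouse's weight (g) as the center of the most
    populated bin across a merged set of epoch samples."""
    # Exclude out-of-range points (empty chamber, or two mice at once).
    valid = [s for s in samples if lo <= s < hi]
    if not valid:
        raise ValueError("no in-range samples")
    counts = {}
    for s in valid:
        b = int(s / bin_width)            # index of the 0.2 g bin
        counts[b] = counts.get(b, 0) + 1
    best = max(counts, key=counts.get)    # most common bin
    return (best + 0.5) * bin_width       # report the bin center
```

Transient values from entry, exit, or a second mouse touching the chamber fall into sparsely populated bins, so the histogram peak tracks the resting weight.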
Figure 3.3: The estimation method, based on extracting the most common value during 12 hours of activity for one mouse. The set of epochs were merged together and binned in 0.2 g increments. The actual weight was 27.6 g, which was equal to the estimated weight in this case.

To assess the accuracy of this estimation method, we compared the estimated weights with actual mouse weights measured manually with a digital scale. Two-hour sets of epochs before and after manual weighing were used to evaluate automated weight accuracy (Figure 3.4).

A comparison of software-estimated weights and actual weights is shown in Figure 3.4, which indicates that this method can predict the weights with an average error of 1.6%. Moreover, circadian cycles were monitored by assessing weighing tube entries over 53 days (Figure 3.5(a)). On average, mice were weighed 42 ± 16 times a day, and there were no days on which mice could not be weighed. We also estimated weights over the 53 days, and the analysis showed periodic patterns of weight fluctuation (Figure 3.5(b)). Interestingly, we observed correlations in weight between selected animal pairs over the 53-day period (Figure 3.5).

Figure 3.4: Comparison of actual weights with estimated weights for 5 mice. Mice were manually weighed 7 times with a digital scale (r=0.90 and p=0.0001).

Figure 3.5: Long-term monitoring of weights in 5 group-housed mice. (a) The activity patterns of mice during each day over 53 days. Each tick mark indicates a weighing tube entry. (b) Weight fluctuation over 53 days for 5 continuously tracked mice.

Chapter 4

Discussion and Conclusion on Automated Weighing System

Several systems have been previously developed to assess water and food intake in rats and mice and to automatically weigh animals. Investigators have measured the weight of water and food that an animal takes in each bout [44] by attaching load cells to the water and food hoppers outside of the cage.
Another automated home cage was developed to monitor mouse weight [22], but it only allowed a single mouse to be tracked.

Moreover, TSE Systems has made an automated body weight measurement device which can be located inside a mouse/rat home cage (TSE Systems). This device is a tube attached to a scale that is located outside of the cage. Once the animal stands inside the tube, its weight is monitored, again without group statistics. The advantage of our system over the previous work is that we can now identify and follow individual mice within a group, rather than only one mouse [22] (URL retrieved: 20/01/2017), by applying RFID readers and tags. The system is modular in nature and can be the basis of more complex future tasks where homecage behavior or brain activity is assessed.

We have developed this automated system to allow us to weigh mice several times per day over a long period of time without disruption by human interaction. The system employs inexpensive hardware and open-source code that we anticipate will promote uptake by the community.

Chapter 5

Automated Tracking System Methods

5.1 Animals and Husbandry

Male mice from the C57Bl/6J inbred strain, bred in the Animal Resources Unit at the University of British Columbia, were used. The mice were kept under a controlled light cycle: 7 AM to 7 PM was the light period and 7 PM to 7 AM was the dark period. Temperature (around 21 °C) and humidity (60%) were also controlled. Animals had free access to water and food in their original homecage. All procedures and animal studies were carried out in accordance with the University of British Columbia Animal Care Committee and conformed to the Canadian Council on Animal Care and Use guidelines. Trials were carried out with 1, 2, or 3 mice in a standard homecage similar to their original one, located on top of the system. After each trial, animals were brought back to their original homecage.
Animal welfare checks were carried out visually once a day.

5.2 RFID Tag Injection

RFID tags were subcutaneously injected into the lower right or left abdomen of mice. These tags are small glass capsules (11.2 × 2 mm) which are biocompatible; inside each capsule is a microchip with a unique ID. Prior to the injection procedure, mice were briefly anesthetized with isoflurane within an induction chamber. Mice continued to receive isoflurane after becoming unconscious in order to maintain anesthesia. The lower abdomen was disinfected with betadine. The tag was loaded inside a needle and then injected into the lower right or left abdomen.

5.3 Printed Circuit Board

In order to track mice in their homecage, I designed a Printed Circuit Board (PCB) with a grid of 18 RFID readers (a 3 × 6 array). The gaps between readers are 3 mm and each reader (ID20LA) has a size of 4.9 × 4.8 cm, so the grid can cover a standard homecage with a bottom area of 28 × 18 cm. Since there might be more than one mouse in the cage and more than one reading from the RFID readers, we need to assign each reader a specific address or ID. Therefore, each reader is connected to one micro-controller (ProTrinket), which drives the reader's Reset pin high to prepare it for any potential read of an RFID tag in the reading range. We cannot toggle the reader's power pin on and off, because one electronic part of the RFID reader should always be powered in order to respond to tags as fast as possible. In addition, each reader can react to a tag only once per period of being high. When a mouse injected with an RFID tag stands on top of a reader that has been turned on for 120 ms, the tag data can be read and sent to the ProTrinket RXD pin (RXD is the pin that receives serial data). The ProTrinket then sends the data to a master micro-computer (Raspberry Pi) via I2C connections. The Raspberry Pi turns on only one ProTrinket at a time via I2C, turns it off after 120 ms, and so on. Each ProTrinket is programmed in the Arduino language.
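The one-reader-at-a-time polling scheme just described can be summarized in a small timing sketch. The helper names below are hypothetical and for illustration only; the actual smbus2-based controller script is listed in Appendix A.

```python
# Round-robin activation of the RFID reader grid: the Raspberry Pi powers
# exactly one reader at a time, each for a fixed 120 ms dwell, then moves on.
# Helper names are illustrative, not from the Appendix A script.

N_READERS = 18        # the 3 x 6 grid
DWELL_S = 0.120       # minimum on-time for an ID20LA reader

def cycle_time(n_readers=N_READERS, dwell=DWELL_S):
    """Time for one full sweep over all readers (the sampling period)."""
    return n_readers * dwell

def active_reader(t_s, n_readers=N_READERS, dwell=DWELL_S):
    """Index of the reader that is powered at elapsed time t_s (seconds)."""
    return int(t_s // dwell) % n_readers

# One full sweep of 18 readers takes 18 * 0.12 s, so each mouse position is
# refreshed at roughly 0.5 Hz.
print(round(cycle_time(), 2))  # -> 2.16
```

This makes the system's temporal resolution explicit: the dwell time is fixed by the reader hardware, so the refresh rate falls linearly as readers are added to the grid.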
This Arduino program gives each ProTrinket a specific I2C address through which data can be sent to the Raspberry Pi. One challenge in using Radio-Frequency Identification (RFID) readers is a phenomenon called RFID reader collision. This phenomenon occurs when the coverage areas of two or more readers overlap, which can lead to two problems: 1) signal interference and 2) multiple reads of the same tag. These problems can be resolved in two ways, which should be implemented together to gain the best result. Two RFID readers (ID20LA), depending on their size, must not be on simultaneously when located less than 16 cm apart; otherwise neither of them can read tags. Therefore, we need to turn on one reader at a time for a certain period. This duration depends on the type of reader: since we used ID20LA readers, they need to be on for at least 120 ms. Readers would not be able to power up the passive tags and receive data from them in less than that amount of time. Although only one reader is turned on at any particular moment, the antennas can still have crosstalk between them, even through the turned-off antennas. To overcome this problem, we wrapped aluminum shields around the readers. With these two solutions, we can safely turn on one reader and read data for 120 ms, and so on. The strength of the electromagnetic field around a reader depends on its size.

Figure 5.1: 3D view of the Automated Tracking System, designed with Blender.

Using a smaller reader such as the ID12LA would lead to a higher spatial resolution; however, smaller readers produce a weaker electromagnetic field, so there is a tradeoff between signal strength and spatial resolution. Since a plastic cage with a thickness of 3 mm sits on top of the grid of RFID readers, and the RFID tags have been injected into the lower abdomen of the mice, the electromagnetic field should be strong enough to read the tags from a distance of at least
Otherwise, readers might miss the tags frequently. The printed circuit boardis fixed in a plastic box and the cage is also fixed at the top of it. As we said before,readers send the data via I2C to a master micro-controller; here we used RaspberryPi. Data contain mice IDs, XY coordinates of each mouse, and recording time, lineby line. Data are extracted and stored in a text file for further analysis. In order torecord the mice activity with a higher resolution, a camera is attached to a series ofaluminum posts and fixed downwardly. The 3D view of this system was designedwith Blender, can be seen in Figure 5.1.5.4 Video-based recording systemAs previously mentioned, a Pi-camera, with a recording rate of 30frames persecond, is mounted at the top of the cage with a distance of 20cm, exactly abovethe center of the enclosure region. The camera is connected to a Raspberry Pi. Thiscamera starts recording once the RFID readers start reading. While readers sendthe data to the ProTrinkets and then ProTrinkets send the data to the Raspberry Pivia I2C connections, Raspberry Pi saves the data in a text file; Raspberry Pi alsosaves the camera recording through .h246 files. it is worth mentioning that whitetapes have been used on the bottom of the cage to make the color contrast moreintense. At the end of the each trial we need to analyze camera recording withcomputer vision techniques. C57Bl/6J mice are black and the bottom of the cageis white, thus it is easier to detect the mice positions. In order to acquire moreaccurate results from the camera, firstly we record the cage with out mice. Thefirst photo can be used as a background to calibrate each frame.5.4.1 Computer Vision TechniqueHerein, Background Subtraction or Foreground Detection technique is im-plemented. Background Subtraction is a technique in the field of image processingand computer vision in which photos or each frame of video are subtracted from abaseline frame or foreground to be used for object detection. 
After subtracting the background from each frame, we need to apply several image-processing steps, such as noise removal and morphological operations. Since the camera is fixed and we can obtain a clean background frame as a reference (that is, the cage without the presence of mice), this technique can be very useful for detecting moving objects. All of the post-processing code was written in Python 3.5 and is available in the appendices.

In order to analyze the videos and detect the mice, we first resize the frames and the reference frame with the resize function in the imutils library (https://github.com/jrosebr1/imutils). Then we convert the frames and the reference from BGR to grayscale using the cv2.cvtColor function. To reduce the intensity of noise, the cv2.GaussianBlur function is helpful. Using cv2.absdiff we subtract the reference frame from the other frames, after which the cv2.threshold function gives us a binary version of each frame in which objects are detectable. By applying morphological methods such as cv2.dilate and cv2.erode, we can further suppress noise and partially remove small objects (blobs or contours) which, unlike mice, are not our targets. Thereafter, with the cv2.findContours function, we can find the large contours which represent our objects; it also gives us the centers of the contours.

Chapter 6

Automated Tracking System Results

The RFID Grid Tracking System is built to analyze the social and individual behavior of a group of mice in a standard homecage. In this system, as described in detail in Chapter 5, a grid of 18 RFID readers monitors the coordinates of each mouse in the homecage and sends the data to ProTrinket micro-controllers and a Raspberry Pi. A camera is also used to validate the accuracy of the RFID grid tracking system results.

Due to physical limitations, the spatial and temporal resolution cannot exceed certain values.
ID20LA RFID readers need to be powered up for at least 120 ms to be able to send data to the micro-controllers, and only one reader can be powered up at a time in a relatively small area such as the bottom of the homecage. Therefore, it takes almost 2 s to activate and read all 18 RFID readers; in effect, the temporal resolution of this system is around 0.5 Hz. Also, if an animal moves quickly in the cage, the system may fail to read its tag despite the animal being close to the active RFID reader; the reader needs at least 120 ms to acquire accurate data from the tag. In addition, if two animals stand very close to each other on top of the active RFID reader, neither of them will be read because of interference; only one tag can be read at a time. Combining all these factors shows that the real temporal resolution is always less than or equal to the nominal resolution of each RFID reader.

Spatial resolution also depends on the size of the RFID readers, which for the ID20LA is around 5 cm. Several smaller RFID readers might increase the spatial resolution; however, the reading range would decrease due to their smaller internal antennas. Because of this tradeoff, we decided to use the ID20LA to have the larger reading area. In this case, each RFID reader tells us where the tag implanted in each mouse is, within a range of 5 cm, every 2 s. For example, if the mouse is situated close to the wall, the readers will report that the animal is at the center of the reader, so the maximum error can be 2.5 cm. Given the size of the animal, around 4 cm in length, this spatial error may be reasonable.

Figure 6.1 shows an example of how the camera and RFID readers monitor the activity of one mouse. The blue lines illustrate the RFID readings: the center of each reader reports one reading connected to the previous one, which, in turn, reports the last reading.
Also, the purple curves represent the series of contour centers, which are estimates of the mouse positions in every frame. At the bottom of the homecage, white tape is used to make the black mouse more visible. The black horizontal and vertical lines show estimates of the RFID reader boundaries.

In order to evaluate the efficiency of the RFID Grid Tracking System, we need to compare it with the computer vision results. One way to compare them is to estimate the distance traveled by the animals over several time spans. Figure 6.2 plots the traveled distance of one mouse as estimated by this system and by computer vision on the X and Y axes, respectively. In order to record the animal activity with a camera, we needed to remove the lid of the homecage, a condition in which the animals are usually very active; this might cause the tracking system to underestimate distance due to its limited temporal resolution. However, the graph shows a good correlation between the computer vision estimates and the automated tracking system results.

The frequency of reading can differ based on the activity of the animals in the cage. Monitoring one mouse for 16 min shows that the maximum frequency can reach 6.25 Hz, while the mean frequency is 1.27 Hz. Figure 6.3 illustrates the reading frequency over the 16 min recording for this animal. In this figure, we can also see the difference between the two monitoring cases, one animal and two animals: the monitoring frequency for two animals is generally less than for

Figure 6.1: Top-down view of the baseplate, recorded with a Pi camera at the top of the homecage. (a) The horizontal and vertical black lines show the boundaries of the RFID readers. Blue lines and purple curves illustrate monitoring with the RFID readers and the computer vision technique, respectively. 65 is the number of the RFID tag and the green box shows an approximation of the mouse area. (b) Illustrates one sample frame after subtraction from the background frame.
(c) To binarize the frames after background subtraction, values greater than 90 (up to 255) are assigned 0 and values less than 60 are assigned 1. This white contour shows the approximate location of the mouse for one frame.

Figure 6.2: Comparing the traveled distance measured using computer vision techniques and the Automated Tracking System. Stars show the distance traveled by the mouse within 3 s bins. The orange line shows equality of the two methods. The black dashed line illustrates the linear regression of the system measurement. The R value for the regression is 0.997 and the slope of the black line is 1.25.

one animal.

The spatial activity can also be seen in Figure 6.4. The activity of one mouse during 16 min was monitored and shown in contour form for both the Automated Tracking System (ATS) and Computer Vision. As we can see, the activity of the animal is higher at the corners and very low at the center. This pattern is usually interpreted as the animals having been uncomfortable and/or anxious [2]. A high correlation between the two monitoring methods can also be seen.

Figure 6.3: Frequency of reading by the Automated RFID Tracking System. Frequency can vary depending on animal activity and the number of animals within the homecage. Here, one mouse and two mice were monitored for 16 min. (a) For one mouse in the homecage, the maximum frequency of reading is 6.25, the mean frequency is 1.27, and the minimum frequency is 0.06. (b) For two mice in the homecage, the maximum frequency is 6.24, the mean frequency is 0.95, and the minimum frequency is 0.08.

Figure 6.4: Contours of activity for the animal during 16 min in the homecage.
(a) Activity monitored by the Automated Tracking System indicates that most activity is at the corners. (b) Activity monitored by the Computer Vision technique; it shows that the major activity is at the left and right sides of the homecage.

Chapter 7

Discussion and Conclusion on Automated Tracking System

Several systems have been developed to automatically track mice in their homecage. One system was developed by San Diego Instruments (http://www.sandiegoinstruments.com/home-cage-activity-system/) (retrieval date: 15/10/2017). They used an array of infrared beams to track mice in the homecage; however, this system only allows monitoring of one mouse at a time. Another example is IntelliCage, which also monitors animals in an isolated environment. We developed a system to track several mice without the need to remove them from their own homecage. Using RFID technology, monitoring mouse activity becomes possible in longitudinal studies. A system very similar to ours was made in [4]. Although our system is quite similar to the one made by [4], by providing Python and Arduino code we showed how other labs can develop a similar system for mice. We also extend this work by suggesting three ideas to make it more efficient in future work.

How the Automated Tracking System can be made more efficient:

• RFID and Computer Vision Hybrid System

– An array of RFID readers can be used to track the animals, but when the animals move very fast the readers may fail to detect the RFID tags. RFID readers also have low temporal and spatial resolution (the mean frequency is around 1 Hz and the spatial resolution is around 5 cm). On the other hand, computer vision has very high temporal and spatial resolution; however, this technique can fail when the animals are very close to each other. Combining these two techniques may give a very high temporal and spatial resolution.
As a matter of fact, while the animals are far enough apart that the camera detects them separately, the two systems simply update each other to ensure that the animals are correctly labeled. When the animals are very close to each other, the RFID system can update the ongoing labeling and then check the labels assigned right before the animals came close. In addition, using the Hungarian Algorithm (https://github.com/Smorodov/Multitarget-tracker) may be helpful to validate the labeling of the animals.

• RFID and Photo-beam Hybrid System

– As mentioned in Chapter 6, the Automated Tracking System reads only one RFID tag at a time, with a mean frequency of 1 Hz. In order to increase the frequency, we need to give the RFID readers some information about approximately which area the animals are momentarily occupying. If we determine the area where the animals are standing, we can turn on those RFID readers close to that area and then label the animals based on their RFID tags. A combination of an array of infrared beams and RFID readers may be helpful for this aim. Once some beams are interrupted, we know the coordinates of the corresponding RFID readers and beams; thus, we can command those RFID readers to read any tags around them. Therefore, the temporal resolution will increase.

• Two RFID Readers at the Same Time

– In the system we described previously, only one RFID reader can be on at a time; however, if we turn on two RFID readers, the mean frequency can be doubled. We may need to use smaller RFID readers, such as the ID12LA, in order to avoid crosstalk between the readers.

Bibliography

[1] J. Aguzzi, V. Sbragaglia, D. Sarriá, J. A. García, C. Costa, J. d. Río, A. Mànuel, P. Menesatti, and F. Sardà. A new laboratory radio frequency identification (RFID) system for behavioural tracking of marine organisms. Sensors, 11(10):9532–9548, 2011. → pages 5

[2] K. R. Bailey and J. N. Crawley. Anxiety-related behaviors in mice. 2009. → pages 24

[3] R. S. Bains, H. L. Cater, R. R. Sillito, A. Chartsias, D.
Sneddon, D. Concas, P. Keskivali-Bond, T. C. Lukins, S. Wells, A. A. Arozena, et al. Analysis of individual mouse activity in group housed animals of different inbred strains using a novel automated home cage analysis system. Frontiers in Behavioral Neuroscience, 10, 2016. → pages 4, 5

[4] R. S. Bains, H. L. Cater, R. R. Sillito, A. Chartsias, D. Sneddon, D. Concas, P. Keskivali-Bond, T. C. Lukins, S. Wells, A. A. Arozena, et al. Analysis of individual mouse activity in group housed animals of different inbred strains using a novel automated home cage analysis system. Frontiers in Behavioral Neuroscience, 10, 2016. → pages 4, 27

[5] M. Bak, A. Fransen, J. Janssen, J. van Os, and M. Drukker. Almost all antipsychotics result in weight gain: a meta-analysis. PLoS ONE, 9(4):e94112, 2014. → pages 2

[6] M. Baker et al. Technology feature: inside the minds of mice and men. Nature, 475(7354):123–128, 2011. → pages 4

[7] J. P. Balcombe, N. D. Barnard, and C. Sandusky. Laboratory routines cause animal stress. Journal of the American Association for Laboratory Animal Science, 43(6):42–51, 2004. → pages 3

[8] C. M. Bekkevold, K. L. Robertson, M. K. Reinhard, A. H. Battles, and N. E. Rowland. Dehydration parameters and standards for laboratory mice. Journal of the American Association for Laboratory Animal Science, 52(3):233–239, 2013. → pages 2

[9] S. R. Blumenthal, V. M. Castro, C. C. Clements, H. R. Rosenfield, S. N. Murphy, M. Fava, J. B. Weilburg, J. L. Erb, S. E. Churchill, I. S. Kohane, et al. An electronic health records study of long-term weight gain following antidepressant use. JAMA Psychiatry, 71(8):889–896, 2014. → pages 3

[10] F. Bolaños, J. M. LeDue, and T. H. Murphy. Cost effective raspberry pi-based radio frequency identification tagging of mice suitable for automated in vivo imaging. Journal of Neuroscience Methods, 276:79–83, 2017. → pages 6

[11] K. Branson, A. A. Robie, J. Bender, P. Perona, and M. H. Dickinson. High-throughput ethomics in large groups of drosophila.
Nature Methods, 6(6):451–457, 2009. → pages 4

[12] X. P. Burgos-Artizzu, P. Dollár, D. Lin, D. J. Anderson, and P. Perona. Social behavior recognition in continuous video. In Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on, pages 1322–1329. IEEE, 2012. → pages 4

[13] E. N. Bush, R. Shapiro, V. E. Knourek-Segel, B. A. Droz, T. Fey, E. Lin, M. E. Brune, and P. B. Jacobson. Chronic treatment with either dexfenfluramine or sibutramine in diet-switched diet-induced obese mice. Endocrine, 29(2):375–381, 2006. → pages 3

[14] L. Catarinucci, R. Colella, L. Mainetti, L. Patrono, S. Pieretti, A. Secco, and I. Sergi. An animal tracking system for behavior analysis using radio frequency identification. Lab Animal, 43(9):321, 2014. → pages 4

[15] S. X. Chen, A. N. Kim, A. J. Peters, and T. Komiyama. Subtype-specific plasticity of inhibitory circuits in motor cortex during motor learning. Nature Neuroscience, 18(8):1109–1115, 2015. → pages 2

[16] C. U. Correll. Weight gain and metabolic effects of mood stabilizers and antipsychotics in pediatric bipolar disorder: a systematic review and pooled analysis of short-term trials. Journal of the American Academy of Child & Adolescent Psychiatry, 46(6):687–700, 2007. → pages 3

[17] F. De Chaumont, R. D.-S. Coura, P. Serreau, A. Cressant, J. Chabout, S. Granon, and J.-C. Olivo-Marin. Computerized video analysis of social interactions in mice. Nature Methods, 9(4):410–417, 2012. → pages 4

[18] R. Ganguli. Weight gain associated with antipsychotic drugs. The Journal of Clinical Psychiatry, 1999. → pages 2

[19] E. H. Goulding, A. K. Schenk, P. Juneja, A. W. MacKay, J. M. Wade, and L. H. Tecott. A robust automated system elucidates mouse home cage behavioral structure. Proceedings of the National Academy of Sciences, 105(52):20575–20582, 2008. → pages 1

[20] D. L. Hickman and M. Swan. Use of a body condition score technique to assess health status in a rat model of polycystic kidney disease.
Journal of the American Association for Laboratory Animal Science, 49(2):155–159, 2010. → pages 2

[21] C. L. Howerton, J. P. Garner, and J. A. Mench. A system utilizing radio frequency identification (RFID) technology to monitor individual rodent behavior in complex social settings. Journal of Neuroscience Methods, 209(1):74–78, 2012. → pages 5

[22] N. Izumo. Development of Balance for Animals, 2015. URL http://www.aandd.jp/support/dev stories/story35.html. → pages 15

[23] H. Jhuang, E. Garrote, X. Yu, V. Khilnani, T. Poggio, A. D. Steele, and T. Serre. Automated home-cage behavioural phenotyping of mice. Nature Communications, 1(6):68, 2010. → pages 4

[24] M. Kabra, A. A. Robie, M. Rivera-Alba, S. Branson, and K. Branson. JAABA: interactive machine learning for automatic annotation of animal behavior. Nature Methods, 10(1):64–67, 2013. → pages 4

[25] D. L. Kelly and R. R. Conley. Thyroid function in treatment-resistant schizophrenia patients treated with quetiapine, risperidone, or fluphenazine. The Journal of Clinical Psychiatry, 66(1):80–84, 2005. → pages 3

[26] L. Lewejohann, A. M. Hoppmann, P. Kegel, M. Kritzler, A. Krüger, and N. Sachser. Behavioral phenotyping of a murine model of Alzheimer's disease in a seminaturalistic environment using RFID tracking. Behavior Research Methods, 41(3):850–856, 2009. → pages 4

[27] L. Maayan and C. U. Correll. Management of antipsychotic-related weight gain. Expert Review of Neurotherapeutics, 10(7):1175–1200, 2010. → pages 2

[28] S. Mandillo, V. Tucci, S. M. Hölter, H. Meziane, M. Al Banchaabouchi, M. Kallnik, H. V. Lad, P. M. Nolan, A.-M. Ouagazzal, E. L. Coghill, et al. Reliability, robustness, and reproducibility in mouse behavioral phenotyping: a cross-laboratory study. Physiological Genomics, 34(3):243–255, 2008. → pages 4

[29] T. H. Murphy, J. D. Boyd, F. Bolaños, M. P. Vanni, G. Silasi, D. Haupt, and J. M. LeDue. High-throughput automated home-cage mesoscopic functional imaging of mouse cortex. Nature Communications, 7, 2016.
→ pages 4, 10

[30] O. Noorshams, J. D. Boyd, and T. H. Murphy. Automating mouse weighing in group homecages with Raspberry Pi micro-computers. Journal of Neuroscience Methods, 285:1–5, 2017. → pages v, 4

[31] M. Paterlini, S. S. Zakharenko, L. Wen-Sung, J. Qin, H. Zhang, J. Mukai, K. G. Westphal, B. Olivier, D. Sulzer, P. Pavlidis, et al. Transcriptional and behavioral interaction between 22q11.2 orthologs modulates schizophrenia-related phenotypes in mice. Nature Neuroscience, 8(11):1586, 2005. → pages 4

[32] D. R. Reed, A. A. Bachmanov, and M. G. Tordoff. Forty mouse strain survey of body composition. Physiology & Behavior, 91(5):593–600, 2007. → pages 8

[33] C. A. Richardson. The power of automated behavioural homecage technologies in characterizing disease progression in laboratory mice: A review. Applied Animal Behaviour Science, 163:19–27, 2015. → pages 2, 4

[34] E. J. Robinson, T. O. Richardson, A. B. Sendova-Franks, O. Feinerman, and N. R. Franks. Radio tagging reveals the roles of corpulence, experience and social information in ant decision making. Behavioral Ecology and Sociobiology, 63(5):627–636, 2009. → pages 4

[35] A. T. Schaefer and A. Claridge-Chang. The surveillance state of behavioral automation. Current Opinion in Neurobiology, 22(1):170–176, 2012. → pages 5

[36] G. Silasi, D. Xiao, M. P. Vanni, A. C. Chen, and T. H. Murphy. Intact skull chronic windows for mesoscopic wide-field imaging in awake mice. Journal of Neuroscience Methods, 267:141–149, 2016. → pages 10

[37] J. L. Silverman, M. Yang, C. Lord, and J. N. Crawley. Behavioural phenotyping assays for mouse models of autism. Nature Reviews Neuroscience, 11(7):490–502, 2010. → pages 4

[38] A. F. Simon, M.-T. Chou, E. D. Salazar, T. Nicholson, N. Saini, S. Metchev, and D. E. Krantz. A simple assay to study social behavior in drosophila: measurement of social space within a group. Genes, Brain and Behavior, 11(2):243–252, 2012. → pages 4

[39] A. D. Steele, W. S. Jackson, O. D. King, and S. Lindquist.
The power of automated high-resolution behavior analysis revealed by its application to mouse models of Huntington's and prion diseases. Proceedings of the National Academy of Sciences, 104(6):1983–1988, 2007. → pages 4

[40] N. A. Swierczek, A. C. Giles, C. H. Rankin, and R. A. Kerr. High-throughput behavioral analysis in C. elegans. Nature Methods, 8(7):592–598, 2011. → pages 4

[41] S. Tordjman, D. Drapier, O. Bonnot, R. Graignic, S. Fortes, D. Cohen, B. Millet, C. Laurent, and P. L. Roubertoux. Animal models relevant to schizophrenia and autism: validity and limitations. Behavior Genetics, 37(1):61–78, 2007. → pages 4

[42] V. Tucci, A. Hardy, and P. M. Nolan. A comparison of physiological and behavioural parameters in C57BL/6J mice undergoing food or water restriction regimes. Behavioural Brain Research, 173(1):22–29, 2006. → pages 2

[43] M. H. Ullman-Culleré and C. J. Foltz. Body condition scoring: a rapid and accurate method for assessing health status in mice. Comparative Medicine, 49(3):319–323, 1999. → pages 2

[44] E. A. Ulman, D. Compton, and J. Kochanek. Measuring food and water intake in rats and mice. ALN Magazine, (October):17–20, 2008. → pages 15

[45] A. Weissbrod, A. Shapiro, G. Vasserman, L. Edry, M. Dayan, A. Yitzhaky, L. Hertzberg, O. Feinerman, and T. Kimchi. Automated long-term tracking and social behavioural phenotyping of animal colonies within a semi-natural environment. Nature Communications, 4:2018, 2013. → pages 4, 5

[46] T. Wetterling. Bodyweight gain with atypical antipsychotics. Drug Safety, 24(1):59–73, 2001. → pages 2, 3

[47] M. Yang, J. L. Silverman, and J. N. Crawley. Automated three-chambered social approach task for mice. Current Protocols in Neuroscience, pages 8–26, 2011. → pages 4

[48] X. Yu, A. D. Steele, V. Khilnani, E. Garrote, H. Jhuang, T. Serre, and T. Poggio. Automated home-cage behavioral phenotyping of mice. 2009. → pages 4

[49] K. Zarringhalam, M. Ka, Y.-H. Kook, J. I. Terranova, Y. Suh, O. D. King, and M. Um.
An open system for automatic home-cage behavioral analysis and its application to male and female mouse models of Huntington's disease. Behavioural Brain Research, 229(1):216–225, 2012. → pages 4

Appendix A

Automated Tracking System Codes

"""
Created on Tue Apr 18 12:30:27 2017

@author: Omid Noorshams
"""

"""Import important libraries."""
from smbus2 import SMBusWrapper
import time
import RPi.GPIO as GPIO
import decimal
from random import shuffle
from picamera import PiCamera

"""Define the camera object."""
camera = PiCamera()

"""Predefine the incorrect (empty-read) data."""
Trash_Data = [0, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255, 255]

"""Define the list of ProTrinket I2C addresses."""
ProT = [None] * 18
ProT[0] = 0x11
ProT[1] = 0x12
ProT[2] = 0x13
ProT[3] = 0x14
ProT[4] = 0x15
ProT[5] = 0x16
ProT[6] = 0x31
ProT[7] = 0x32
ProT[8] = 0x33
ProT[9] = 0x34
ProT[10] = 0x35
ProT[11] = 0x36
ProT[12] = 0x51
ProT[13] = 0x52
ProT[14] = 0x53
ProT[15] = 0x54
ProT[16] = 0x55
ProT[17] = 0x56

"""Define the locations of the RFID readers."""
ProT_Dic = {ProT[0]: "1-1", ProT[1]: "1-2", ProT[2]: "1-3", ProT[3]: "1-4",
            ProT[4]: "1-5", ProT[5]: "1-6", ProT[6]: "2-1", ProT[7]: "2-2",
            ProT[8]: "2-3", ProT[9]: "2-4", ProT[10]: "2-5", ProT[11]: "2-6",
            ProT[12]: "3-1", ProT[13]: "3-2", ProT[14]: "3-3", ProT[15]: "3-4",
            ProT[16]: "3-5", ProT[17]: "3-6"}

"""Arrange the RFID readers based on their priority to be turned on."""
ProT_arrange = [i for i in range(0, 18)]

"""A number that is sent to the ProTrinkets as a flag to turn on the RFID readers."""

def writeNumber(value, address_1):
    """Use the SMBusWrapper to write bytes to a ProTrinket."""
    try:
        with SMBusWrapper(1) as bus:
            bus.write_byte_data(address_1, 0, value)
    except IOError as e:
        print(e)
        return -1

def readNumber(address_1):
    number1 = []
    """Use the SMBusWrapper to read bytes from a ProTrinket."""
    try:
        with SMBusWrapper(1) as bus:
            number1 = bus.read_i2c_block_data(address_1, 0, 12)
            time.time()
    except IOError as e:
        print(e)
    return number1, time.time()

var = int(1)

"""Take a picture of the empty homecage as a background."""
camera.capture(Pathway to save camera recordings)

"""Sleep for 15 seconds to have enough time to place the animals inside the homecage."""
time.sleep(15)

"""Hold the starting time."""
t0 = time.time()
t = time.time()

for j in range(10):
    camera.start_recording(Pathway to save camera recordings)
    """Define the resolution of the camera and its frame rate."""
    camera.resolution = (500, 312)
    camera.framerate = 30
    with open(text file pathway to save data, "w") as f:
        f.write(str(t0) + "\n")
        """Define the time duration of each trial = 500 sec."""
        while t - t0 < 500:
            """Record the time of each reading."""
            t = time.time()
            """Send the turn-on flag to the ProTrinkets and receive data from them."""
            for i in ProT_arrange:
                Data = Trash_Data
                try:
                    with SMBusWrapper(1) as bus:
                        bus.write_byte_data(ProT[i], 0, var)
                        time.sleep(0.123)
                        number1 = bus.read_i2c_block_data(ProT[i], 0, 12)
                except IOError as e:
                    print(e)
                Time = time.time()
                Data = number1
                """If the received data are not equal to the incorrect data, write the
                location of the animal, its RFID tag, and the time of reading to a text file."""
                if (Data != Trash_Data and Data != []):
                    f.write(str(Data[len(Data) - 5:len(Data)]) + " " + ProT_Dic[ProT[i]] + " " + str(Time) + "\n")
                    print(ProT_Dic[ProT[i]])
                    print(Data, Time)
                    print()
    """Close the camera recording at the end of the trial."""
    camera.stop_recording()

/* Import the important libraries (Wire.h). */
#include <Wire.h>

/* Define the slave's I2C address (e.g.
left”””44Predefine deques to hold values in 256 bytes arrays. pts is a deque to holdvalue of center of contours.”””ap = argparse.ArgumentParser()ap.add argument(”v”, ”video”, help=”path to the video file”)ap.add argument(”a”, ”minarea”, type=int, default=3000, help=”minimumarea size”)ap.add argument(”b”, ”buffer”, type=int, default=256, help=”max buffersize”)args = vars(ap.parse args())pts = deque(maxlen=args[”buffer”])”””Predefine a list to hold RFID tag IDs.”””Tags = []const dist = 5File name = ”M2 1”filename = ”Pathway of the text file in which data have been saved withraspberry Pi””””Open the text file in which data have been saved with raspberry Pi and readline by line. Then hold tags ID, time of reading, and locations of animals.”””with open(”Pathway of the text file in which data have been saved withraspberry Pi”,”r”) as f:T0 = f.readline()for line in f:f contents = f.readline ()Tag = str(f contents [1:19]).replace(” ” ””)Tags.append (Tag)Tags = set (Tags)Tags = list(Tags)45f.close()”””Dump data for each Tag ID in text files.”””for Tagg in Tags:with open(”Pathway to dump data in a text file”, ”w”) as f p:with open(””Pathway to read data from text file in whichdata has been saved from Raspberry Pi” , ”r”) as f:for line in iter(f):Tag = str(line [1:19]).replace(” ” ””)line = Tag + line [20:len(line)]if (Tagg == Tag):f p.write (line)f.close()f p.close()”””Predefine lists to hold time and locations.”””XX = []YY = []T = []for Tagg in Tags:with open (”RTS test post ”+Tagg+” VD.txt”, ”w”) as f VD:with open (”C:/Users/user/Documents/omid/RFIDgrid tracking/data/rerts vid pic tex/RTS test ”+File name+” post ”+Tagg+”.txt”, ”r”) as f p:for i,line in enumerate(f p):if i == 0:line 0 = lineTime 0 = line [15:len(line 0)-2]Y0 = int(line 0 [11])X0 = int(line 0 [13])46line 0 = lineTime 1 = line [15:len(line)-2]Y1 = int(line [11])X1 = int(line [13])XX.append (Y1)YY.append (X1)T.append (float(Time 1[6:13]))d = math.sqrt ((X1-X0)**2 +(Y1-Y0)**2)*int(const dist)if (i ¿ 0):v = 
d/(float(Time 1)-float(Time 0))f VD.write (str(d)+” ”+str(v)+” /n”)”””Define camera object to capture videos.”””camera = cv2.VideoCapture(Pathway to save videos)”””import the first frame as a background.”””firstFrame = NonefirstFrame = cv2.imread(first frame pathway to import)”””Another way to find the first frame from the very first frame of main videos.”””for i in range(3):( ,firstFrame) = camera.read()”””Edit the size and color of the first frame. Color should be switched to gray.Then use Gaussian filter to reduce spatial noise.”””firstFrame = imutils.resize(firstFrame, width=500)47firstFrame = cv2.cvtColor(firstFrame, cv2.COLOR BGR2GRAY)firstFrame = cv2.GaussianBlur(firstFrame, (31,31),0)”””Find the size of each frame .”””Y = np.shape (firstFrame)[0]X = np.shape (firstFrame)[1]”””Divide width and length of frames by 6 and 3 (number of RFID readers inwidth and length), respectively which give us the spatial resolution”””R Y = Y/3R X = X/6”””Predefine the size of home cage (mm).”””cage X = 280cage Y = 180”””Find the size of pixels in X and Y.”””pix X = cage X / Xpix Y = cage Y / Y”””Define a dictionary to hold the location of RFID readers.”””R pos = 11:(R X/2,R Y/2),12:(R X*1.5,R Y/2),13:(R X*2.5,R Y/2),14:(R X*3.5,R Y/2),15:(R X*4.5,R Y/2),16:(R X*5.5,R Y/2),21:(R X/2,R Y*1.5),22:(R X*1.5,R Y*1.5),23:(R X*2.5,R Y*1.5),24:(R X*3.5,R Y*1.5),25:(R X*4.5,R Y*1.5),26:(R X*5.5,R Y*1.5),4831:(R X/2,R Y*2.5),32:(R X*1.5,R Y*2.5),33:(R X*2.5,R Y*2.5),34:(R X*3.5,R Y*2.5),35:(R X*4.5,R Y*2.5),36:(R X*5.5,R Y*2.5)”””Predefine the location of RFID readers in a list as a code. 11 means readerat row one and column one. 
12 mean reader at row one and column 2, andso on.”””R = [11,12,13,14,15,16,21,22,23,24,25,26,31,32,33,34,35,36]”””Define arrays as deques to hold the position (location) and time of RFIDreaders which have been read from text file.”””Pos deq = deque (maxlen = len ( XX ) )T deq = deque (maxlen = len ( XX ) )R Pos deq = deque (maxlen = len ( XX ) )for i in range(1,len(XX)):”””Find the time of reading for each tag ID and subtract from the initial time.”””Pos deq.append ((XX[i],YY[i]))T trunc = str(float(T[i])-float(T[1]))T deq.append (T trunc[0:5])T [i] = T[i]-T[0]”””Pop out the first line of text file since it not related to the main data.”””T.pop(0)T.insert (0,0)”””Predefine a list to hold the traveled distance by animals measured by Auto-mated Tracking System.49”””sum d = []Pos dum = (0,0)counter = 0d = 0sumd = 0”””predefine of the frame counter.”””counter 2 = 0”””Predefine of a list to hold frames value in the loop.”””frames = []”””Frequency of videos.”””T frame = 1/30T frames = []evrey frames = 100”””Save videos in avi files. first one for main video to record animals move-ment, tracking curves, positions of RFID readers which have been recordedby Automated Tracking System. Second one to record the movement of ani-mals after applying Gaussian filter, switching to gray color and subtractingfrom the background. Third file is used to save video of animals movementafter applying binary for a specific threshold.”””video 1 = cv2.VideoWriter(filename+”.avi”,cv2.VideoWriter fourcc(*’XVID’), 30,(X, Y))video 2 = cv2.VideoWriter(filename+”gray.avi”,cv2.VideoWriter fourcc(*’XVID’), 30,(X, Y))50video 3 = cv2.VideoWriter(filename+”delta.avi”,cv2.VideoWriter fourcc(*’XVID’), 30,(X, Y))”””A while loop to read frames from the video and analyze them one by one tofind the location of the animals,...”””while True:”””Read from camera object and dump the frames in frame list and apply thechange to a boolean parameter to check whether there was any actual framefor each reading. 
If parameter grabbed is False the loop will be broken andit will read from next frame.”””(grabbed, frame) = camera.read()if not grabbed:break”””Edit each frame size (here width is assigned to be 500). Color should begray. After applying gray function to the frame, apply Gaussian filter toreduce spatial noise. Then using ”absdiff” subtract the first frame (back-ground) from the current frame. After a function needs to be used assignvalues in the current frame (after applying previous steps) 0 or 1. The valuebigger than a specific threshold should be 0 and less than that should be 1.”””frame = imutils.resize (frame, width=500)gray = cv2.cvtColor (frame, cv2.COLOR BGR2GRAY)gray = cv2.GaussianBlur (gray, (21, 21), 10)frameDelta = cv2.absdiff(firstFrame, gray)thresh = cv2.threshold(frameDelta,100, 255, cv2.THRESH BINARY)[1]”””51Using function ”dilate” and ”erode”, noise can be reduced. This techniquegives a better contour with less sharp edges.”””thresh = cv2.dilate(thresh, None, 1)thresh = cv2.erode(thresh, None, 50)”””Find the contour for each frames with ”cv2.findcontours”, ”cnts” showsthe lists of values for each contours. 
Here we only need the biggest contourfor one animal and two biggest contour for two animals, and so on.”””(im,cnts,heir) = cv2.findContours(thresh.copy(),cv2.RETR EXTERNAL,cv2.CHAIN APPROX SIMPLE)”””Draw 5 vertical and 2 horizontal lines on each frame to schematically depictRFID readers.”””for i in range(5):for j in range(2):cv2.line(frame,(int(X),int((j+1)*R Y)),(0,int((j+1)*R Y)), (0,255,0), 2)cv2.line(frame,(int((i+1)*R X),int(Y)),(int((i+1)*R X),0), (0,255,0), 2)”””Open a loop through each contour.”””for c in cnts:”””If the size of contour is less than the threshold then continue.”””if cv2.contourArea(c) ¡ args[”min area”]:continue52”””Find the X and Y coordination of starting point of a surrounding rectangulararea around the contour, also hold the width and hight of this rectangle.”””(x, y, w, h) = cv2.boundingRect(c)”””Hold the center of rectangle as the center of contour.”””pts.appendleft ((x + int(w/3), y + int(h/3)))”””Predefine the distance of travel for one frame to another. Empty list.”””dis = []”””Using a loop find the position in ”R” list (as mentioned previously) ”num”is the index of each one and ”pos” is the code of position, e.g. 11, 12,13. find the distance between center of contour and all of the RFID readerslocation.”””for num , pos in enumerate (R,start=1):dis.append(math.sqrt ( ( R pos [pos][0]-(x+int(w/2)) )**2+ ( R pos [pos][1]-(y+int(h/2)) )**2 ))”””Find the minimum distance of center and RFID readers positions. 
Basedon the index and code and whether this code is in the very latest line ofAutomated Tracking System text file which has been already hold in a list;Draw a filled green circle at the center.”””min pos = min (dis)inx = dis.index (min pos)flag = Falseif len(R Pos deq) == 0:53R Pos deq.appendleft (str(R [inx]))if Pos deq [0]!=[]:if str(Pos deq[0][0])+str(Pos deq[0][1]) == str(R [inx]):flag = Truecv2.circle (frame, (int(R pos[R [inx]][0]),int(R pos[R [inx]][1]))“ , 10, (255,100,90),10 )R Pos deq.appendleft (str(R [inx]))”””Put the tag ID as a text adjacent to the center.”””cv2.putText(frame, Tag[-2:].format(text),(x + int(w/2), y + int(h/2)),cv2.FONT HERSHEY SIMPLEX,0.75, (0, 0, 255), 2)cv2.circle (frame, (y+int(h/2),x+int(w/2)),4, (100,30,90) )Pos deq.append([])else:cv2.circle (frame, ( int(R pos[int( R Pos deq [0]) ][0]),int(R pos[int( R Pos deq [0]) ][1]) ),3, (0,255,0),10 )”””If the list of position is not empty and at least two points have been recorded,in a for loop from the first position to the last draw lines sequentially (firstone to the second and second position to the third and so on).”””for i in range (1,len(R Pos deq)):if R Pos deq[i-1] is None and R Pos deq[i] is None:continuecv2.line (frame, ( int(R pos[int( R Pos deq [i-1]) ][0]),54int(R pos[int( R Pos deq [i-1]) ][1]) ),( int(R pos[int( R Pos deq [i]) ][0]),int(R pos[int( R Pos deq [i]) ][1]) ),(255,0,0), 5)”””Draw a rectangle around the contour to depict the approximate situation ofthe animal.”””for i in np.arange(1, len(pts)):if pts[i - 1] is None or pts[i] is None:continuecv2.line(frame, pts[i - 1], pts[i], (140, 70, 100), 2)cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)”””Open and show the videos in three different windows. If the key ”Q” ispressed break the main while loop.”””cv2.imshow(”1”, frame)cv2.imshow(”Thresh”, thresh)cv2.imshow(”Frame Delta”, frameDelta)key = cv2.waitKey(1) & 0xFFif key == ord(”q”):break”””Release the camera. 
Then close the windows.”””camera.release()cv2.destroyAllWindows()55Appendix BAutomated Tracking SystemSchematic561122334455667788H HG GF FE ED DC CB BA AEnixLABRPi.SchDoc4/20/2017 00/ATitle:Project:PawelKDate:File:Nr: Rev.Sheet: ofRPi, Level shifters and Voltage regulatorsTracking System with RFID GridDrawn by:PK200417/A1 3SDASCLRESExpIO_Int0ExpIO_Int2ExpIO_Int1TP05U_GridMux GridMux.SchDocGPIO-023Gnd93.3V1GPIO-035GPIO-047GPIO-1711GPIO-2713GPIO-22153.3V17GPIO-1019GPIO-0921GPIO-1123Gnd25ID-SD27GPIO-0529GPIO-0631GPIO-1333GPIO-1935GPIO-2637Gnd39Gnd 6+5V 2GPIO-14 8GPIO-24 18GPIO-18 12GPIO-23 16GPIO-07 26GPIO-25 22GPIO-08 24Gnd 20ID-SC 28GPIO-12 32GPIO-21 40GPIO-16 36GPIO-20 38Gnd 30+5V 4GPIO-15 10Gnd 14Gnd 34M1 RPI-2GND+5VC20.1uRp21k60GNDC10.1u+3.3VRp11k60RPi_SDARPi_SCLGPIO_18GPIO_20GPIO_21SDASCLExpIO_Int0S1RESETRESGNDVcB 14B1 13A12VcA1OE8B2 12A23B3 11A34B4 10A45U2 TXB0104PWRGND+3.3V +5VRPi_SDARPi_SCLGPIO_18C60.1uGPIO_20GPIO_21C70.1u01 02 03 04 05 0607 08 09 10 11 1213 14 15 16 17 18RFID Grid12J1+12V-C130.1uC120.33uVi1G2Vo 3U4 L7805CD2TD230WQ04FN +5VGNDR5200R0D3+5VMH56x3.5mmMH2 MH3MH4MH1MH6Add switch on/offVzVi1G2Vo 3U3 L7808CD2TGNDC90.1uR4200R0D1+8V+C10100u/16VC80.33uS2Power TP/RFIDS3Power On/Off+C14100u/16V+C1147u/25VGND+3.3V+5VC30.1uC40.1uExpIO_Int1ExpO_Int2C50.1uR310k0 +5VVcB 7B1 8A15VcA3OE6B2 1A24U1 TXB0102DCURTP05GPIO_17GPIO_17I2C Pullup resistors RpRp min = (5-0.4)V/3mA = 1.53kPIC101 PIC102 COC1 PIC201 PIC202 COC2 PIC301 PIC302 COC3 PIC401 PIC402 COC4 PIC501 PIC502 COC5 PIC601 PIC602 OC6 PIC701 PIC702 OC7 PIC801 PIC802 OC8 PIC901 PIC902 OC9 PIC1001 PIC1002 OC1  PIC1101 PIC1102 COC11 PIC1201 PIC1202 COC12 PIC1301 PIC1302 COC13 PIC1401 PIC1402 COC14 PID101 PID102 COD1 PID201 PID202 3COD2 PID301 PID302 COD3 PIJ101 PIJ102 COJ1 PIM101 PIM102 PIM103 PIM104 PIM105 PIM106 PIM107 PIM108 PIM109 PIM1010 PIM1011 PIM1012 PIM1013 PIM1014 PIM1015 PIM1016 PIM1017 PIM1018 PIM1019 PIM1020 PIM1021 PIM1022 PIM1023 PIM1024 PIM1025 PIM1026 PIM1027 PIM1028 PIM1029 PIM1030 PIM1031 
PIM1032 PIM1033 PIM1034 PIM1035 PIM1036 PIM1037 PIM1038 PIM1039 PIM1040 COM1 PIMH100 COMH  PIMH200 COMH  PIMH300 COMH  PIMH400 COMH  PIMH500 COMH  PIMH600 COMH  PIR301 PIR302 COR3 PIR401 PIR402 COR4 PIR501 PIR502 COR  PIRp101 PIRp102 CORp1 PIRp201 PIRp202 CORp2 PIS101 2PIS103 4PIS105 6COS1 PIS201 PIS202 PIS203 COS2 PIS301 PIS302 PIS303 COS3 PIU101 PIU102 PIU103 PIU104 PIU105 PIU106 PIU107 PIU108 COU1 PIU201 PIU202 PIU203 PIU204 PIU205 PIU207 PIU208 PIU2010 PIU2011 PIU2012 PIU2013 PIU2014 COU2 PIU301 PIU302 PIU303 COU3 PIU401 PIU402 PIU403 COU4 NLExpIO0  NLExpIO0I t  NLExpO0I t  NLGPIO0  NLGPIO0  NLGPIO0  NLGPIO0  NLR\E\S\ NLRPi0SC  NLRPi0SD  NLSC  NLSDA NLTP05 571122334455667788H HG GF FE ED DC CB BA AEnixLABGridMux.SchDoc4/20/2017 00/ATitle:Project:PawelKDate:File:Nr: Rev.Sheet: ofRFID Modules & I/O ExpandersTracking System with RFID GridDrawn by:PK200417/A2 3TP_INTTP_05RESSDASCLU_TP_RFID_01TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_02TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_03TP_RFID.SchDocSCLRESSDARESSDASCLTP_05TP_INTU_TP_RFID_04TP_RFID.SchDocSCLRESSDARESSDASCLTP_05TP_INTU_TP_RFID_05TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_06TP_RFID.SchDocSCLRESSDASCLRESSDASDASCLRESGND+5VC170.1uTP_05RESSDASCLTP_INTU_TP_RFID_07TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_08TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_09TP_RFID.SchDocSCLRESSDARESSDASCLTP_05TP_INTU_TP_RFID_10TP_RFID.SchDocSCLRESSDARESSDASCLTP_05TP_INTU_TP_RFID_11TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_12TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_13TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_14TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_15TP_RFID.SchDocSCLRESSDARESSDASCLTP_05TP_INTU_TP_RFID_16TP_RFID.SchDocSCLRESSDARESSDASCLTP_05TP_INTU_TP_RFID_17TP_RFID.SchDocSCLRESSDATP_05RESSDASCLTP_INTU_TP_RFID_18TP_RFID.SchDocSCLRESSDAExpIO_Int0ExpIO_Int2ExpIO_Int1SCL1SDA2A05GP0 10Vss 9Vdd 18A23A14Rst6Int8GP1 11GP2 12GP3 13GP4 14GP5 15GP6 
16GP7 17U5 MCP23008-E/SOSCLSDAGND+5VC180.1uSCL1SDA2A05GP0 10Vss 9Vdd 18A23A14Rst6Int8GP1 11GP2 12GP3 13GP4 14GP5 15GP6 16GP7 17U6MCP23008-E/SOSCLSDAGND+5VC190.1uSCL1SDA2A05GP0 10Vss 9Vdd 18A23A14Rst6Int8GP1 11GP2 12GP3 13GP4 14GP5 15GP6 16GP7 17U7 MCP23008-E/SOSCLSDAR810k0ExpIO_Int0ExpIO_Int1ExpIO_Int2RESRESRESGNDGNDGND+5VR910k0+5VTP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05TP05 TP05R1010k0+5VI2C Addr = 1[001]I2C Addr = 2[010]I2C Addr = 3[011]I2C Addr = 0[000] is reserved for TrinketPRO general callPIC1701 PIC1702 COC17 PIC1801 PIC1802 OC1  PIC1901 PIC1902 COC19 PIR801 PIR802 COR8 PIR901 PIR902 COR9 PIR1001 PIR1002 COR10 PIU501 PIU502 PIU503 PIU504 PIU505 PIU506 PIU508 PIU509 PIU5010 PIU5011 PIU5012 PIU5013 PIU5014 PIU5015 PIU5016 PIU5017 PIU5018 COU5 PIU601 PIU602 PIU603 PIU604 PIU605 PIU606 PIU608 PIU609 PIU6010 PIU6011 PIU6012 PIU6013 PIU6014 PIU6015 PIU6016 PIU6017 PIU6018 COU6 PIU701 PIU702 PIU703 PIU704 PIU705 PIU706 PIU708 PIU709 PIU7010 PIU7011 PIU7012 PIU7013 PIU7014 PIU7015 PIU7016 PIU7017 PIU7018 COU7 NLExpIO0I t  POEx O0Int0 NLExpIO0Int1 POEx O0Int1 NLExpIO0Int2 POEx O0Int2 NLR\E\S\ PO \E\S\ NLSCL PO CL NLSDA PO DA NLTP05 PO P05 581122334455667788H HG GF FE ED DC CB BA AEnixLABTP_RFID.SchDoc4/20/2017 00/ATitle:Project:PawelKDate:File:Nr: Rev.Sheet: ofTrinketPRO and RFID ModuleTracking System with RFID GridDrawn by:PK200417/A3 3232124222019181716151413512346789101112TPro1TrinketPROGnd 1Reset 2N/C3N/C4CP 5TR6FS 7Vcc 11D18D09ReadLED 10RFID1 ID-20LATP_INTTP_05RESSDAGNDVzGNDC160.1uID_DataC150.1uID_TRR6200R0D4LedRGNDID_RLID_VccID_DataID_ResetID_RLLEDID_TRID_RLSCLID_ResetID_ResetID_VccID_VccActive: Square wave 3.3 kHzTag in RangeDO NOT put a copper ground plane under ID-20LAI=45mAIp= (50-150)mARESRESTP_INTTP05R7200R0GNDD5LedBOPTIONALPIC1501 PIC1502 OC15 PIC1601 PIC1602 OC16 PID401 PID402 COD4 PID501 PID502 COD5 PIR601 PIR602 COR6 PIR701 PIR702 COR7 PIRFID101 PIRFID102 PIRFID103 PIRFID104 PIRFID105 PIRFID106 PIRFID107 PIRFID108 
PIRFID109 PIRFID1010 PIRFID1011 CORFID1 PITPro101 PITPro102 PITPro103 PITPro104 PITPro105 PITPro106 PITPro107 PITPro108 PITPro109 PITPro1010 PITPro1011 PITPro1012 PITPro1013 PITPro1014 PITPro1015 PITPro1016 PITPro1017 PITPro1018 PITPro1019 PITPro1020 PITPro1021 PITPro1022 PITPro1023 PITPro1024 COTPro1 NLID0D  NLID0Reset NLID0R  NLID0TR NLID0Vcc NLLED PO DA PO CL NLR\E\S\ POR\ \S\ NLTP05 PO P 05 NLTP0IN  PO 0INT 5960
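The analysis listing above converts consecutive RFID-grid detections into a traveled distance and a speed (Euclidean distance between two-digit reader codes, scaled by the grid pitch const_dist = 5, divided by the time between readings). A minimal, self-contained sketch of that calculation follows; the helper names (reader_rc, distance_and_speeds) and the sample detections are illustrative assumptions, not part of the thesis code.

```python
import math

# Reader codes follow the listing: 11 = row 1, column 1; 12 = row 1,
# column 2; ... for a 3 x 6 grid. const_dist is the grid pitch the
# analysis script uses to scale grid distances.
const_dist = 5

def reader_rc(code):
    # Split a two-digit reader code into (row, column) grid coordinates.
    return divmod(code, 10)

def distance_and_speeds(detections):
    # detections: list of (reader_code, timestamp) pairs for one tag.
    # Returns the total traveled distance and per-step speeds, using the
    # same Euclidean-distance-times-grid-pitch rule as the listing.
    total = 0.0
    speeds = []
    for (c0, t0), (c1, t1) in zip(detections, detections[1:]):
        (r0, k0), (r1, k1) = reader_rc(c0), reader_rc(c1)
        d = math.sqrt((r1 - r0) ** 2 + (k1 - k0) ** 2) * const_dist
        total += d
        if t1 > t0:
            speeds.append(d / (t1 - t0))
    return total, speeds

# Hypothetical detections: one column step, then one diagonal step.
dets = [(11, 0.0), (12, 2.0), (23, 3.0)]
total, speeds = distance_and_speeds(dets)
# total = 5 + 5*sqrt(2); speeds = [2.5, 5*sqrt(2)]
```

This mirrors the d and v values the listing writes to the per-tag `_VD.txt` files, without the file I/O.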
