<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Department of Electrical and Computer Engineering &#187; Projects</title>
	<atom:link href="http://departments.digipen.edu/ece/category/projects/feed/" rel="self" type="application/rss+xml" />
	<link>http://departments.digipen.edu/ece</link>
	<description>DigiPen Institute of Technology</description>
	<lastBuildDate>Wed, 02 Sep 2015 05:41:21 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.1.42</generator>
	<item>
		<title>Music Instrument Training Device</title>
		<link>http://departments.digipen.edu/ece/music-instrument-training-device/</link>
		<comments>http://departments.digipen.edu/ece/music-instrument-training-device/#comments</comments>
		<pubDate>Thu, 23 Apr 2015 05:31:05 +0000</pubDate>
		<dc:creator><![CDATA[jthomas1]]></dc:creator>
				<category><![CDATA[Projects]]></category>

		<guid isPermaLink="false">http://departments.digipen.edu/ece/?p=115</guid>
		<description><![CDATA[The purpose of this project is to make instrument learning software available to anyone and for any instrument. We present a hardware implementation of a sound to MIDI converter. A Piezo sensor is used to get better quality sound data from acoustic instruments. By combining a Piezo pickup, a high performance microcontroller, and Fast Fourier &#8230; <a href="http://departments.digipen.edu/ece/music-instrument-training-device/" class="more-link">Continue reading <span class="screen-reader-text">Music Instrument Training Device</span></a>]]></description>
				<content:encoded><![CDATA[<p>The purpose of this project is to make instrument-learning software available to anyone, for any instrument. We present a hardware implementation of a sound-to-MIDI converter. A piezo sensor is used to get better-quality sound data from acoustic instruments. By combining a piezo pickup, a high-performance microcontroller, and Fast Fourier Transforms, we can determine the notes played on the instrument. The paper also discusses the techniques used to get more accurate data, such as applying the Harmonic Product Sum to the Fast Fourier data, and the use of HAL drivers to allow programming in C++ on the STM32F4 ARM chip. Power consumption is another major topic, as is sending data wirelessly with XBee.</p>
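The note-detection pipeline above (piezo samples, an FFT, then a harmonic product to reinforce the fundamental) can be sketched in a few lines. This is a generic Python/numpy illustration of the technique, not the STM32F4 firmware; the sample rate and test signal are made up:

```python
import numpy as np

def hps_pitch(frame, sample_rate, num_harmonics=3):
    """Estimate the fundamental frequency of one audio frame using a
    harmonic product: downsampled copies of the FFT magnitude spectrum
    are multiplied so the harmonics reinforce the fundamental bin."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    hps = spectrum.copy()
    for h in range(2, num_harmonics + 1):
        decimated = spectrum[::h]          # spectrum at h-times the frequency
        hps[:len(decimated)] *= decimated
    peak_bin = int(np.argmax(hps))
    return peak_bin * sample_rate / len(frame)

# A synthetic 440 Hz note with two overtones, one second at 8 kHz
sr = 8000
t = np.arange(sr) / sr
note = (np.sin(2 * np.pi * 440 * t)
        + 0.5 * np.sin(2 * np.pi * 880 * t)
        + 0.3 * np.sin(2 * np.pi * 1320 * t))
print(hps_pitch(note, sr))  # 440.0
```

Multiplying decimated copies of the spectrum makes the true fundamental stand out even when an overtone carries more energy than the fundamental itself.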
<p class="wp-more-tag mce-wp-more" title="Read more..."><span id="more-115"></span></p>
<p class="wp-more-tag mce-wp-more" title="Read more..."><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/09/Music-Trainer-PCB.png"><img class="aligncenter size-full wp-image-125" src="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/09/Music-Trainer-PCB.png" alt="Music Trainer PCB" width="790" height="609" /></a> The device clips onto an instrument and picks up its vibrations with the piezo sensor. The built-in processor calculates the note being played and sends it out wirelessly in MIDI format, which means any instrument-learning software that uses MIDI can work with this hardware. Alongside the hardware there is custom software for this project, written by Shivam Kumar, a Software Engineering student at DigiPen. This software works with the hardware to help teach players their instrument. It has been developed for the Windows operating system, with the hope of supporting Android, Mac, and Linux in the future. In one mode, random sheet-music notes scroll in from the right and the player must hit the correct note before it reaches the far left. This is just one of many modes; others include ear training and song practice. The software works independently of the hardware and is usable with other MIDI devices, but it is designed to work especially well with the hardware described in this paper.</p>
<p><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/09/Cody-Harris-ECE410.pdf">Cody Harris, &#8220;A Musical Instrument Training Device&#8221;, DigiPen report, 17 pages, 2015 (PDF)</a></p>
]]></content:encoded>
			<wfw:commentRss>http://departments.digipen.edu/ece/music-instrument-training-device/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Project RADAR</title>
		<link>http://departments.digipen.edu/ece/project-radar/</link>
		<comments>http://departments.digipen.edu/ece/project-radar/#comments</comments>
		<pubDate>Thu, 24 Apr 2014 01:22:12 +0000</pubDate>
		<dc:creator><![CDATA[jthomas1]]></dc:creator>
				<category><![CDATA[Projects]]></category>

		<guid isPermaLink="false">http://departments.digipen.edu/ece/?p=59</guid>
		<description><![CDATA[Student: Kevin Secretan The goal of Project RADAR is to use radar imaging to possibly detect human survivors in burning or collapsed buildings. The project focuses on using Synthetic Aperture Radar (SAR), as SAR images are a useful tool for picking out possible humans. Recording the information for a SAR image using this radar system requires moving &#8230; <a href="http://departments.digipen.edu/ece/project-radar/" class="more-link">Continue reading <span class="screen-reader-text">Project RADAR</span></a>]]></description>
				<content:encoded><![CDATA[<p>Student: Kevin Secretan</p>
<p>The goal of Project RADAR is to use radar imaging to detect possible human survivors in burning or collapsed buildings. The project focuses on Synthetic Aperture Radar (SAR), as SAR images are a useful tool for picking out possible humans. Recording the information for a SAR image with this radar system requires moving the radar in a straight line over ten feet, in two-inch increments, acquiring range data at each point. To move the radar automatically, a time-lapse dolly was constructed out of PVC pipe, rollerblade wheels attached to a platform, and a DC motor driven by an MSP430 microcontroller.<br />
<span id="more-59"></span></p>
<figure id="attachment_78" style="width: 880px;" class="wp-caption aligncenter"><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/RADAR-Dolly.png"><img class="size-full wp-image-78" src="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/RADAR-Dolly.png" alt="CAD drawing of the RADAR Dolly" width="880" height="691" /></a><figcaption class="wp-caption-text">CAD drawing of the RADAR Dolly</figcaption></figure>
<p>Project RADAR initially focused on understanding and duplicating one of MIT&#8217;s OpenCourseWare projects, which describes how to build a simple radar system for a little over $300 that can sense the speed and range of moving targets. New Python code processes the radar&#8217;s data in real time and shows live range spectrograms to users wirelessly through their web browser. The web interface decouples the display of data from the physical location of the processor, allowing all software to run on an embedded device.</p>
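The live range display described above boils down to an FFT of the deramped beat signal, whose frequency maps linearly to target range. A minimal numpy sketch follows; the chirp bandwidth, duration, and sample rate here are illustrative placeholders, not the project's actual values:

```python
import numpy as np

# Illustrative FMCW parameters -- assumed, not taken from the MIT design
C = 3e8          # speed of light, m/s
BW = 80e6        # chirp bandwidth, Hz
T_CHIRP = 20e-3  # chirp duration, s
FS = 44100       # sample rate of the deramped (beat) signal, Hz

def range_profile(beat_signal):
    """One FMCW chirp -> magnitude vs. range. Deramping turns each target
    into a beat tone whose frequency maps to range: R = c * f_b * T / (2 * BW)."""
    n = len(beat_signal)
    magnitude = np.abs(np.fft.rfft(beat_signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    ranges = C * freqs * T_CHIRP / (2 * BW)
    return ranges, magnitude

# Simulate the beat tone a point target at 10 m would produce
r_true = 10.0
f_beat = 2 * BW * r_true / (C * T_CHIRP)
t = np.arange(int(FS * T_CHIRP)) / FS
ranges, mag = range_profile(np.cos(2 * np.pi * f_beat * t))
peak_range = ranges[np.argmax(mag)]  # within one ~1.9 m range bin of 10 m
```

Stacking successive range profiles as image rows yields the live range spectrogram served to the browser.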
<p>By collecting range data at fixed increments along a straight track, a user can generate Synthetic Aperture Radar (SAR) images. A Python version of the SAR image-formation algorithm was written because MIT&#8217;s Matlab script took many minutes on a powerful PC to process the data, and under Octave on an even more powerful PC it took half an hour or more. Our Python implementation produces output equivalent to the Matlab version&#8217;s.</p>
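To illustrate how range profiles collected along a straight track become an image, here is a naive time-domain backprojection sketch. This is a simplified, generic stand-in for SAR image formation, not the range-migration algorithm from MIT's Matlab script that the project actually ported, and the geometry and data are synthetic:

```python
import numpy as np

def backproject(profiles, xs, grid_x, grid_y, rmax):
    """Naive time-domain backprojection: profiles[i] is the range profile
    recorded with the antenna at (xs[i], 0); each image pixel accumulates
    the profile sample at its distance from every antenna position."""
    n_bins = profiles.shape[1]
    image = np.zeros((len(grid_y), len(grid_x)))
    for profile, x_ant in zip(profiles, xs):
        dx = grid_x[None, :] - x_ant
        r = np.sqrt(dx**2 + grid_y[:, None]**2)   # pixel-to-antenna distance
        bins = np.clip((r / rmax * (n_bins - 1)).astype(int), 0, n_bins - 1)
        image += profile[bins]
    return image

# Synthetic data: one point target at (0 m, 5 m), nine aperture positions
xs = np.linspace(-2, 2, 9)
n_bins, rmax = 200, 10.0
profiles = np.zeros((len(xs), n_bins))
for i, x_ant in enumerate(xs):
    r_target = np.sqrt(x_ant**2 + 5.0**2)
    profiles[i, int(r_target / rmax * (n_bins - 1))] = 1.0

grid_x = np.linspace(-3, 3, 61)   # 0.1 m pixels
grid_y = np.linspace(1, 9, 81)
image = backproject(profiles, xs, grid_x, grid_y, rmax)
# The pixel at the true target position collects a hit from all 9 positions
print(image[40, 30], image.max())
```

Only the pixel consistent with every recorded range accumulates energy from all aperture positions, which is why moving the radar along the dolly sharpens the cross-range dimension.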
<figure id="attachment_76" style="width: 1024px;" class="wp-caption aligncenter"><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/Radar-PCB.png"><img class="size-full wp-image-76" src="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/Radar-PCB.png" alt="Project RADAR Printed Circuit Board" width="1024" height="768" /></a><figcaption class="wp-caption-text">Project RADAR Printed Circuit Board</figcaption></figure>
<p>A Printed Circuit Board (PCB) was created to reduce weight and size for portability and to improve reliability. An embedded processor board replaced the processing computer, so the radar can function as a standalone unit. It requires only a network connection, over which other devices can view its data and command the radar. Cantennas from Quonset Microwave were used and appear to give slightly better results than the panel directional antennas.</p>
<p><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/Secretan-Project-RADAR.pdf">Secretan, K., &#8220;Project RADAR&#8221;, DigiPen report, 33 pages, 2014 (PDF)</a></p>
]]></content:encoded>
			<wfw:commentRss>http://departments.digipen.edu/ece/project-radar/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
		<item>
		<title>Direct Abstraction</title>
		<link>http://departments.digipen.edu/ece/direct-abstraction/</link>
		<comments>http://departments.digipen.edu/ece/direct-abstraction/#comments</comments>
		<pubDate>Wed, 23 Apr 2014 23:49:42 +0000</pubDate>
		<dc:creator><![CDATA[jthomas1]]></dc:creator>
				<category><![CDATA[Projects]]></category>

		<guid isPermaLink="false">http://departments.digipen.edu/ece/?p=31</guid>
		<description><![CDATA[Student: Matthew Kaes Direct Abstraction is a high level scripting language that brings a shared code base to both the PC environment and ARM hardware. In the case of the PC version there is an interpreter  that parses the user’s code in a text file and executes it. For the ARM device an embedded virtual machine will look for &#8230; <a href="http://departments.digipen.edu/ece/direct-abstraction/" class="more-link">Continue reading <span class="screen-reader-text">Direct Abstraction</span></a>]]></description>
				<content:encoded><![CDATA[<p>Student: Matthew Kaes</p>
<p>Direct Abstraction is a high-level scripting language that brings a shared code base to both the PC environment and ARM hardware. On the PC, an interpreter parses the user’s code from a text file and executes it. On the ARM device, an embedded virtual machine looks for code on an SD card at boot, then parses and runs it. In both cases, development time is improved by removing long compile times and the complex tool chains otherwise needed to get code to build. Because the code base is the same for the PC and the ARM device, the developer can write code for the end device without ever running it on the device: code can be developed rapidly on a PC, with instant turnaround for testing, and the final code can then simply be ported to the device and work.</p>
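The PC-interpreter/embedded-VM split described above rests on a simple idea: source code is parsed once into a structure that any host can walk. A toy Python evaluator (purely illustrative; this is not Direct Abstraction's actual grammar or VM) shows the shape of that idea:

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(node, env):
    """Walk a tiny prefix syntax tree: a node is a number, a variable
    name, or a tuple (operator, left, right)."""
    if isinstance(node, (int, float)):
        return node
    if isinstance(node, str):
        return env[node]                    # variable lookup
    op, left, right = node
    return OPS[op](evaluate(left, env), evaluate(right, env))

# The same tree could be interpreted on a PC or serialized to an SD card
# and walked by an embedded VM -- no recompilation of the user's code
program = ("*", ("+", "x", 2), 10)          # (x + 2) * 10
print(evaluate(program, {"x": 3}))          # 50
```

Because the evaluator, not the host toolchain, defines the language's behavior, the same program runs identically wherever an evaluator exists.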
<p><span id="more-31"></span></p>
<figure id="attachment_44" style="width: 663px;" class="wp-caption aligncenter"><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/DA-code.png"><img class="size-full wp-image-44" src="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/DA-code.png" alt="Direct Abstraction IDE" width="663" height="463" /></a><figcaption class="wp-caption-text">Direct Abstraction IDE</figcaption></figure>
<p>The project was started as a proof of concept that a high-level language can be fast enough, flexible enough, and powerful enough for real-world use. Over the course of the project, however, the focus shifted to the importance of a unified development environment and to how absorbing all parts of the development cycle into a single system can greatly reduce its inefficiencies.</p>
<p><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2015/08/Kaes-Direct-Abstraction.pdf">Kaes, M., &#8220;Direct Abstraction&#8221;, DigiPen report, 24 pages, 2014 (PDF)</a></p>
<p><a href="http://departments.digipen.edu/ece/wp-content/uploads/sites/8/2014/04/DA-User-Manual.pdf">Kaes, M., &#8220;Direct Abstraction User Manual&#8221;, DigiPen report, 17 pages, 2014 (PDF)</a></p>
]]></content:encoded>
			<wfw:commentRss>http://departments.digipen.edu/ece/direct-abstraction/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
