Assembled device with the top cover removed

Step 1: First application

Node.js, Git and Python 2.7 are installed on each LTPS by default, but NPM is not. To install it, connect to your device with SSH and run the following:

    # Set up the package repositories
    smart channel --add 150a type=rpm-md name='LTPS all' baseurl= -y
    smart channel --add 150c type=rpm-md name='LTPS cortexa' baseurl= -y
    smart update
    # Install NPM
    smart install nodejs-npm -y

You can see every event as soon as it's sent to IoT Hub.
Step 5: Configure Stream Analytics

Note: To complete this part of the tutorial you'll need an active Microsoft Power BI subscription. Before the information can be delivered to Power BI, it must be processed by an Azure Stream Analytics job. Choose New > Internet of Things > Stream Analytics Job. Configure the job, then click 'Create'. Wait a couple of minutes until the job has been created, then select All resources and click the name of your Stream Analytics job.
Click Inputs > Add and set up the input in the following way:

Input alias: data-from-tps
Source Type: Data stream
Source: IoT Hub
Subscription: Use IoT hub from current subscription
Endpoint: Messaging
Shared access policy name: iothubowner
Consumer group: powerbi

Click 'Create'.

It doesn't have to be silicon all the time. I have a Gallium-nitride device here at the Cumps lab that I'm going to try out.
In some high voltage, high power designs, GaN FETs have advantages over Si. On the other hand they are also more difficult to drive. The chip that I have contains a built-in smart GaN FET driver.
That takes away the complexities of driving the power stage correctly. We're covering fairly new technology here. The documents are still marked technology preview. I received them from TI after attending a GaN seminar and answering the quiz correctly. In this post I'm doing a first try-out: I'm powering a 12 V, 1.25 A load.

What's in the Package?

I'm using a TI LMG5200 GaN Half-Bridge Power Stage.
That's a fancy title for what is in essence a 2-FET half-bridge setup with driver logic in a single package. The driver stage takes care that the not-so-easy-to-drive FETs are kept within their safe operating boundaries. Unlike many integrated smart half and full bridges on the market, you can control the output of the high and low side separately.

(Image taken from the technology preview document lmg5200.pdf.)
These are my very first steps, so I'll focus on trying the driver in this post. I'll cover specifications and technology in a follow-up. TI's GaN overview page has some good info on the technology. It's vendor-biased info, but I bet you're smart enough to filter that out.

Starting Up the Half-Bridge

I'm using the package on an evaluation board. In essence, that board is an implementation of the reference design (all rules of the reference document have been applied).
The only added functionality is the provisioning of a stable bias supply, an output low-pass filter and some logic to generate the high and low driver signals from a single PWM signal. That evaluation board by itself is good for a few blogs.

The Start-Up Sequence

You need to power up the device in a particular sequence. First you need to bring up the bias voltage.
Then you connect the PWM signal. The last step is to bring up the input power source. You shut down in the reverse order.

The Set-Up

My source for power and PWM - and the frequency counter and output voltage meter - is the old trusted (by many distrusted) all-in-one bench instrument. The 6 V bias is delivered by its variable 0-30 V power supply. The 20 V input is generated by wiring the fixed 5 V and 15 V supplies in series. The PWM signal (5 V, 100 kHz - 2 MHz) comes from its function generator TTL output.
I'm checking the frequency with its universal counter, and I'm verifying the output voltage with its built-in DMM. I've connected a Rigol DS1052E to the PWM input and the output. The load is an incandescent 12 V 15 W bulb. As expected, everything works fine.
Any other outcome would be a surprise. The task that I'm giving the GaN driver is very well within its operating range. The whole setup stayed nicely under control and nothing got hot on the board. Below are the captures of the DC and AC analysis of the output signal. Don't get carried away by the ripple on the AC signal. That's just fine.
We're not testing a regulated DC power supply here. This is a PWM signal that's filtered by a coil and some caps. It will perfectly manage the power sent to the load.
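A quick sanity check (my own arithmetic, not a figure from TI's documents): an ideal filtered half-bridge output averages out to the duty cycle times the input voltage, so with the 20 V input used here, a duty cycle of around 60% gives roughly 0.6 × 20 V = 12 V - right where the bulb wants to be.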
DC and AC captures

The initial exercise was a success. I'm now confident that I can properly drive the GaN IC, and I can start probing a bit deeper under the surface.
It doesn't have to be silicon all the time. I have a Gallium-nitride device here at the Cumps lab that I'm going to try out. In some high voltage, high power designs, GaN FETs have advantages over Si.
On the other hand they are also more difficult to drive. The chip that I have contains a built-in smart GaN FET driver. That takes away the complexities of driving the power stage correctly.
We're covering fairly new technology here. The documents are still marked technology preview. I received them from TI after attending a GaN seminar and answering the quiz correctly. In this post I'm probing the switching output and checking the dead time between switching transitions.

The Switching Node Output

(Image taken from the technology preview document lmg5200.pdf.)

The switching node (pin 8, SW in the schematic above) is the output node between the upper and lower half of the half-bridge. It's the power output signal - driven high and low by the two output FETs to generate the final PWM.
Between the transitions there has to be some dead time - a time when it's guaranteed that both sides of the bridge are shut off. To allow fast switching times, it's key to optimize the dead time. It should be long enough to ensure stability, but no longer. In an optimal configuration, one side of the bridge is always on and one side off.
Dead time eats away from that optimal situation and limits the maximum PWM frequency.

What is Dead Time?

A definition from an IGBT application note (it applies to GaN modules too): In order to avoid bridge shoot-through it is always recommended to add a so-called “interlock delay time” or, more popularly, “dead time” into the control scheme. With this additional time one IGBT will always be turned off first and the other will be turned on after the dead time has expired; hence bridge shoot-through caused by the unsymmetrical turn-on and turn-off times of the IGBT devices can be avoided. The GaN driver that I'm using here, where the driver logic is built into the same package as the GaN power FETs, allows for dead times smaller than 10 ns. On my evaluation board, the dead time is configured to be approx 8 ns. Let's probe that!
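To put those numbers in perspective (my own arithmetic, not TI's): at a 1 MHz switching frequency the period is 1000 ns, so two 8 ns dead times per cycle consume only about 1.6% of the period; at 10 MHz the same dead time would already claim 16%. That's why single-digit-nanosecond dead times matter for fast GaN stages.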
Test Setup

The documentation gives a detailed overview of how to do these measurements. We're talking about fast edges here, so our test setup has to be done right to avoid introducing parasitic inductance via our probe. Ever wondered why there's a spring in the accessories bag of your scope? It's there to do measurements like this, at frequencies where the ground wire of the scope can act as an extra inductor or as a noise antenna. The ground spring minimizes that loop. I've measured two signals: the dead time when switching the output from low to high, and - just for kicks - the rise time of the output PWM.
I'm measuring a dead time of 7.2 ns - almost the 8 ns specified in the doco. The rise time - measured just before the output LC filter, with a 1 A output load and 50% duty cycle - is 3.8 ns. We're dealing with a fast switcher here.
That opens new opportunities for small high-power switching. But it's also going to cause some headaches on the EMC compliance front.

It doesn't have to be silicon all the time. I have a Gallium-nitride device here at the Cumps lab that I'm going to try out. In some high voltage, high power designs, GaN FETs have advantages over Si. On the other hand they are also more difficult to drive.
The chip that I have contains a built-in smart GaN FET driver. That takes away the complexities of driving the power stage correctly. We're covering fairly new technology here. The documents are still marked technology preview. I received them from TI after attending a GaN seminar and answering the quiz correctly. In this post I'm building a PWM deadband generator with a Hercules LaunchPad.

PWM with Deadband

To drive our GaN half-bridge, we need two complementary PWM signals. One to drive the upper FET, the other to drive the lower FET.
When one is high, the other has to be low. The worst thing that can happen to a half-bridge is that both FETs are conducting. So we have that almost covered. Almost, because we need more. For a tiny amount of time during switching, we have to take care that neither of the FETs is on - the Dead Time or Deadband.
For an explanation of Dead Time, check the previous post. My goal in today's blog is to create that signal with a Hercules LaunchPad PWM module. This is the signal we're after - the hatched areas indicate where both signals (and both FETs) should be off:

Each PWM module of the Hercules controller has a pair of outputs (A and B) that are tied together. They act together, and can be used to generate tightly coupled signals.
By default, the B signal is the inverse of A. When A is high, B is low and vice versa. But you can change that. One of the things you can change is to introduce a delay in the rising edge and the falling edge. That's what we're after.
Both signals should have a delay on their rising edge, so that that edge only happens a given time after the counterpart's falling edge. If we manage to do that (and we will), we have what we need: a set of signals that are each other's complement, but with a gap between one switching off and its partner switching on.

Two Lines of Code

We configure the PWM module with HALCoGen, the visual setup application.
Our first task is to enable the driver, pin multiplexing and timer settings.

(Screenshots: Driver, Mux and additional clock Mux configuration.)

In the PWM tab, we enable the first PWM pair, and set the delays.
Setting the delays needs some more explanation. We have access to two delays, a rising edge and a falling edge delay. And each of them can be used for one of the two signals, but not both. So we can't just define a rising edge delay for both channels.
Here's how we get around that situation:
- For the A signal, all is simple: we define the rising edge delay, and route the A signal through that.
- For signal B, it's a bit more tricky. Because we've already used the Rising Edge Delay for signal A, it isn't available for B.
But we still have the Falling Edge Delay available. And we can invert our signal at the beginning and at the end of its route.
So if we invert signal B by selecting High Polarity (making it identical to signal A at the input), then enable the Falling Edge Delay, and invert the signal again at the output (the Invert Polarity checkbox), we have sneakily achieved what we wanted. The Falling Edge becomes the Rising Edge after the last inversion, and we now have two opposite signals, each with their Rising Edge delayed.

The Result

On the oscilloscope, you can see the output of both PWMs. I've added a math A+B signal to show that there's a time where both signals are low.
The Deadband.

(Image: Bart Simpson looking over the wall.)

In this article, I've exaggerated the times to make everything more visible. To drive the LMG5200, we'd choose delays around 6 or 7 ns.
All parameters are runtime configurable, so we can change PWM frequency, duty cycle and delays from firmware. Each can be changed without impacting the other parameters. Full control, anyone? And as promised, the two lines of code needed to make this work:

    #include "HL_etpwm.h"

    etpwmInit();
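To illustrate what "runtime configurable" means in practice, here is a rough sketch of adjusting the period, duty cycle and deadband on the fly. The register and field names (etpwmREG1, TBPRD, CMPA, DBRED, DBFED) follow TI's ePWM register naming but are assumptions on my part - verify them against the HALCoGen-generated headers before relying on anything like this.

    /* Sketch only: adjust PWM period, duty cycle and deadband at runtime.
       Register and field names are assumptions based on TI's ePWM naming;
       check them against the HALCoGen-generated register header. */
    #include "HL_etpwm.h"

    void pwm_reconfigure(uint16 period_ticks, uint16 duty_ticks, uint16 deadband_ticks)
    {
        etpwmREG1->TBPRD = period_ticks;    /* PWM period in time-base ticks      */
        etpwmREG1->CMPA  = duty_ticks;      /* compare value -> duty cycle        */
        etpwmREG1->DBRED = deadband_ticks;  /* rising-edge delay (A signal path)  */
        etpwmREG1->DBFED = deadband_ticks;  /* falling-edge delay (B signal path) */
    }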
Results

The system will start up with the lamp fully dimmed. You can control the brightness by rotating the encoder. Use the current meter to check the current coming from the power supply, and the current going to the lamp.
Check how it changes when rotating the scroll wheel. Attach the voltmeter and check how the output voltage changes. Attach your scope to the PWM signal. If you have a math function, you can see the duty cycle change and the frequency staying stable.
The CCS / HALCoGen projects are attached to the blog.

I'm picking my Arduino back up to do development on it. My Linux laptop recently died, so I'm reviewing Windows solutions, and I've happened upon the following so far. I've discovered that with the Arduino IDE, some Arduino-compatible hardware is now lagging behind, forcing me to use older versions of the IDE because their hardware profiles don't match the newer layouts, which is causing problems. I might be able to get this working with Cygwin in Windows (I mention that and a thousand Linux users shudder), but since Linux isn't on my main hardware at the moment this isn't much of an option. Visual Studio is now mostly free, at least for the Community edition, as is Atmel Studio (I think?). Visual Micro also has a Pro/non-Pro counterpart, but getting arbitrary Arduino-compatible boards working runs into similar issues to the Arduino IDE with its hardware profiles. Which do you use/prefer or would recommend?
I suppose I could use a virtual machine and go back to Linux. Or there's the other option: code the chip directly.

The United States boasts some of the most impressive research universities in the world, many of which have earned a reputation as hotbeds of innovation. Funding from private companies and federal agencies has helped entice thousands of engineers to these schools in an effort to uncover the manufacturing industry’s next breakthrough technology. In 2014, Walmart awarded grants to seven leading research and development institutions to uncover new ideas and create jobs that would help boost U.S. manufacturing.
In 2015, the U.S. Department of Defense launched a $150 million initiative designed to encourage manufacturers, universities and non-profits to develop a manufacturing hub focused on revolutionary fibers and textile technologies. With the ability to attract global innovators and the funding to support them, America’s research universities have been responsible for a number of exciting advancements in manufacturing. Here are three of them.
Purdue University, Indiana: Sinuous flow

Researchers at Purdue have discovered sinuous flow, a type of metal deformation that could change the way manufacturers cut metals in the future. While using high-speed microphotography to analyze the results of cutting ductile metals, researchers found that the metal forms thin folds instead of breaking off the same way each time.
Further testing revealed that since ink traditionally used to mark metals sticks very well, it can significantly suppress the folding behavior brought about by sinuous flow. As a result, applying the standard marking ink may reduce the amount of energy needed to cut metals by up to 50 percent.
In addition to improving machining efficiency and the surface quality of metals, using less force will also limit costs by increasing the longevity of cutting tools.

Northwestern University & University of Illinois: Pop-up 3D printing

While the potential of 3D printing within the manufacturing industry has shown great promise, the innovative technology does have its shortcomings.
Difficulty adapting to a wide variety of materials, coupled with less-than-stellar speeds, has limited the impact of 3D printing to date. Those constraints, however, may soon be a thing of the past thanks to researchers at Northwestern University and the University of Illinois. Inspired by a pop-up Christmas card sent to one of their team members, these researchers began to consider the idea of building 3D-printed objects from a flat surface. Implementing the same technique as the pop-up card, they placed small cuts and indentations in plastic objects that would “pop-up” when pushed together to form different shapes. What makes the design so unique is that flat surfaces can be made quickly from just about any material available. This added flexibility may eventually lead to the creation of pop-up shelters that are shipped flat to natural disaster sites and then easily assembled upon arrival. There is also the possibility that the process could be used in tissue engineering to help mold new tissue to a specific shape and size.
University of Texas: Wearable health patches

Researchers at the University of Texas at Austin have invented a manufacturing method that can create disposable health-monitoring patches. Similar to a tattoo, these thin wearable devices collect data on a user’s vital signs, hydration level, muscle movement, temperature and even brain activity. Although such devices have drawn interest from consumers eager to maintain a healthy lifestyle, an extensive and costly production process has severely reduced their chances for success. By cutting manufacturing time from several days to 20 minutes, the repeatable “cut-and-paste” method developed by researchers at UT Austin promises to boost those odds. The breakthrough process also does not require a clean room, wafers or advanced equipment, making production relatively inexpensive. Have you worked or studied at a US research university?
What were your impressions of the experience? Which institutes - in the USA or worldwide - do you think deserve more recognition for the research work they're currently doing? Share your thoughts in the comments section below.

Hi, I would like to try CMSIS on a real system, but I don't see where to start. I pulled the cmsis-plus xPack project from GitHub, but it looks like it doesn't have any processor-specific code. I would like to know what needs to be done to use the system on a real microcontroller (in my case, the STM32L0 family, but any example would be useful). For example, what do I need to implement to be able to make the system go into the right sleep mode when idle?
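For context, the only low-level building block I can see so far is the bare Cortex-M sleep entry; a minimal sketch of what I mean is below, assuming the standard STM32L0 CMSIS device header. What I don't know is where something like this should hook into the cmsis-plus idle handling.

    /* Bare-metal sketch: put a Cortex-M0+ (STM32L0) into plain Sleep mode
       until the next interrupt. Where this belongs in the cmsis-plus idle
       loop is exactly my question. */
    #include "stm32l0xx.h"   /* CMSIS device header from ST's STM32L0 pack */

    static void sleep_until_interrupt(void)
    {
        SCB->SCR &= ~SCB_SCR_SLEEPDEEP_Msk;  /* plain Sleep, not Stop/Standby */
        __WFI();                             /* wait for interrupt            */
    }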
So is there an example project or some documentation describing what to do? Thanks, Armin.

This is my first Pi, and while I have a background in microprocessor technology, I find myself stumbling through this setup. I have a Pi 3B with a 3.5" touchscreen plugged in. I have a USB mouse and keyboard installed and headphones plugged in as well.
I am using a SanDisk 16GB SD card that I loaded with the 2016-05-27-raspbian-jessie.img file. I have not connected an HDMI display, as I was hoping that it would work with the touchscreen or be accessible from the Win7 desktop I have it connected to for power(once the Pi booted up). I read that it takes a long time for the first boot, but how long? I'm over an hour now, and I still occasionally have the green led flash on and the red one off for a second. My touch screen is lit up, but nothing is displayed on it.
I have an explorer window up, but no sign of the Pi being recognized by my desktop. Is it taking longer because of the size of the SD card, and it has to reformat it? Are there any error codes for the flashes that I should be aware of?
(there doesn't seem to be any pattern at the moment; the green occasionally flashes on, and over an even longer period the red one occasionally flashes off). I'm supposed to shut this down 'cleanly', but how do I do that if it never comes up correctly? Obviously I will not get an answer before I have to power this down for the night, but when I do (assuming it never comes out of its current state of setup) will it damage the Pi in any way, or just possibly the SD card? Should I take the touchscreen off, hook up an HDMI monitor and try again?
Or should I blank the SD card, reformat it, and reload the image file in case it was corrupted from this first attempt? Should I start with NOOBS? I didn't because after reading, I felt that I would just end up with raspbian-jessie anyway, so why hassle with the 'middleman'. I had no issue using Win32DiskImager to load the image file onto the SD card, but maybe I should have started with NOOBS anyway. Any help would be greatly appreciated, as I stumble through learning all of this.
Thanks :-)

Update: After nearly 2 hours of sporadic flashing, but absolutely nothing else, I pulled the USB power cord. While powered down I removed the touchscreen and plugged in an HDMI monitor. On power up, it booted and went immediately to a desktop. It must have finished its initial setup the first time.
Anyway, the Pi works and now I need to learn what I can do with it. Not sure about the touch screen, but will try again after I learn more about the Pi. Just thought I'd update for anyone who might have a similar issue.

Following on from last week's discussion of our members' experiences, this week we'd like to hear about some of the more regrettable chapters of your careers. Whether it was a bad experience with an employer, a project that went badly off the rails or something your heart wasn't really in, use the comments section below to tell us about the worst working experiences of your career. How did you find yourself in the role in the first place? What made it stick out as a particularly negative or unfulfilling experience?
Were you able to resolve the issues, or did you simply have to walk away? Were you able to take any positives from the experience? What would you do differently if you found yourself in a similar situation today? Now, we don't want to be responsible for any lawsuits, so be mindful to avoid 'naming and shaming' if discussing a specific employer. We'll be looking for the best, funniest and most interesting answers for a future feature.
Hello Challengers! We are coming into the final stretch of the Design Challenge! It's hard to believe that there are just ten days left before the final submission date. Please make sure to have your final projects submitted by 11:59 PM GMT on 29 August 2016 to be considered for the phenomenal prizes. There have been some reports that the replacement SenseHats and other missing parts were not received by our challengers. There was an unforeseen and unknown (until this week) problem with those products being delivered. They have arrived back to me today, and I will work on getting these shipped to you ASAP.
They must clear our compliance department before receipt. Despite this delay, we have made the decision to stick to the original close date. We will ship out the replacement SenseHats and missing parts to those who have shown progress on their project over the last few months. Thankfully this is the majority of our Challengers. Great job keeping up with the challenge, and helping each other along the way. We are very pleased with the efforts thus far, and look forward to a spirited dash to the finish line! Have a great weekend everyone!
Dave

Our Challengers (purely for the sake of using our handy @mention feature to alert them).

A Linux Based Image Detecting Security Camera

Traditional security camera solutions are very good at detecting movement; however, these solutions are unaware of what that movement actually is. They cannot automatically tell the difference between trees moving in the wind, a car or a human. We wanted to have a system that could identify humans and record such information; this would allow it to discriminate between uninteresting activity (such as a car driving past on the road) and more interesting activity such as the presence of a stranger. Thus the HAL-CAM 9001 was born. The camera doesn’t just output a video file; it also outputs a text file or table that contains time-stamps to indicate exactly when a human was spotted.
Others may wish to identify other things such as (say) a cat. The idea is to have a searchable and readable text file that a user can quickly browse for any interesting activity.
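To make that concrete, an entry in such a log might look something like the lines below - purely an illustration of the intended format, not actual output from the HAL-CAM:

    2016-08-20 18:02:31  person detected
    2016-08-20 18:02:47  person detected
    2016-08-20 23:14:05  motion only, no person match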
This tutorial describes the build process. This part covers the hardware in order to have an outdoor-ready HAL-CAM 9001. Check out the 90-second video summary (it is followed by a few minutes of video showing some of the construction that we recorded). What is this project about?
The UK has always had a tradition of embracing the security camera. When I went to college I remember my apartment still having a black-and-white video entry system containing an image tube such as a Vidicon. This was despite the Charge Coupled Device (CCD) having been invented several decades previously; testament to how rugged and reliable valve and tube technologies could be. Fast-forward to 2016 and there are millions of security cameras scrutinizing the public and a typical conversation during a coffee break with a colleague could well begin with “so what security cameras do you have?”; It was just this type of conversation that led to my colleague Aminder and me discussing the latest in home IP camera technology (IP cameras use a network to transfer video data) and how he was using a Raspberry Pi to operate it.
We wanted to build our own low-cost open source IP security camera solution suitable for outdoor use.

HALCam: A joint project by Shabaz and Aminder

The higher-performance Raspberry Pi 3 offered some interesting possibilities because it means we can run software that ordinarily needed more computing power than was possible in a low-cost outdoor unit. The Linux operating system and open source projects allow features to be implemented that today are not present in typical home cameras, in particular the image recognition feature. We think such a system is potentially more useful than a traditional camera because there is no need to perform a video search; it is easier to just hit Ctrl-F and search for text in a file. Also, the more we thought about it, the more we realized that what we actually wanted was a general-purpose outdoor computer that could be used for other functions too. For instance, if the camera is outdoors then it may as well do things like measure the outdoor temperature, or in the future open electric gates, switch on the lighting when people arrive and so on.
The Pi 3 does have several built-in wireless technologies that could be useful for this.

Basic Tools

The main aim of the hardware design is to have an outdoor-capable, weather-resistant camera/computer. To build it we used a weatherproof plastic case and fitted everything using nuts/bolts, with acrylic or Perspex sheet as our support material inside the case to hold everything in place. The design could be replicated exactly or with modifications to suit the materials and equipment that are available. For example, a 3D printer could be used instead of cutting acrylic parts.
In general, some basic parts and tools that will come in handy are listed below.
- M3 (3mm) screws, short and long
- 3mm thick acrylic sheet (around 300x200mm is more than enough)
- 5-minute 2-part epoxy glue (such as Araldite Rapid)
- Cutting tools, clamps
- Electric drill and drill stand (if possible), soldering iron
- Eyewear, because lots of drilling and cutting is involved

Construction Overview

In essence HAL-9001 consists of a computer (the Pi 3), the camera module and a power supply. There are therefore three main steps to building HAL-9001:
- Building a Pi Subassembly
- Building a Camera Subassembly
- Building a Power Supply Board

There are also miscellaneous operations that need to be performed to fit and connect everything into the enclosure, such as drilling holes, fitting the heat sink and cable gland, attaching wires and so on.
These operations are varied and will be discussed throughout this tutorial. Finally there are some cosmetic operations that are optional such as painting the enclosure. Refer to the diagram here for the terminology that will be used in this project. Building the Pi Subassembly The Pi Subassembly consists of a Pi 3 attached to an acrylic (Perspex) sheet called the Pi Mount using plastic spacers and a shaped heat pipe in-between. The photo here shows it upside-down. When placed inside the case, only the underside of this will be visible. The Pi Mount piece of acrylic is used to secure the Pi into the case using two screw holes.
It also serves another purpose; it acts as a support for a heat pipe. A heat pipe is a sealed metal tube which has channels inside that pass thermal energy using a fluid. It is needed because the Pi can get hot when running intensive software.
This is particularly the case when ARM ‘NEON’ computer instructions are executed repeatedly, and this is the case for one of the software libraries which is needed for this project. It is essential to keep the Pi cool inside the enclosure. Our strategy was to rely on passive air-cooling using a heat pipe and a heat sink but if you’re in a hot country this might not be enough and you may need to also consider installing a small fan. The idea behind this solution is to use a Raspberry PI as a server that ensures the communication between different devices in the house and the user through smartphones, internal web server, motion and voice control.
As a base, the Raspberry Pi runs a C application that acts as a server, and the devices are built around Atmel microcontrollers programmed to interpret the sensor data and the messages from the server. The communication is done through I2C, and I am working at the moment on an RF communication protocol that uses the same idea as the I2C protocol. The design of the system is modular, and this makes it possible to connect new devices to the I2C line; with minimal changes to the server application the system will know how to control them and how to display them to the user. In this case the Atmel chips work as drivers, and the Raspberry Pi can be updated to do new things by sending messages to the devices (see the small sketch after this paragraph). A simple application that turns the lights on/off can be updated to have a timer or to react to different environment changes. One more challenge of the project is to hide all the devices so that they can be used only through the user applications, without removing the normal interaction that the human is accustomed to.
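To make the message idea concrete, here is a minimal sketch of sending a one-byte "light on" command from the Pi over I2C using the Linux i2c-dev interface. The device address (0x20) and the command bytes are purely placeholders for illustration; the real devices and their message format are not shown here.

    /* Minimal illustration only: send a hypothetical "light on" command
       to a device at I2C address 0x20 on the Pi's I2C bus 1. */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/i2c-dev.h>

    int main(void)
    {
        int fd = open("/dev/i2c-1", O_RDWR);      /* I2C bus on the Pi header */
        if (fd < 0) { perror("open"); return 1; }

        if (ioctl(fd, I2C_SLAVE, 0x20) < 0) {     /* hypothetical device address */
            perror("ioctl"); close(fd); return 1;
        }

        unsigned char cmd[2] = { 0x01, 0x01 };    /* hypothetical: register 0x01 (light), value 1 (on) */
        if (write(fd, cmd, 2) != 2) {
            perror("write"); close(fd); return 1;
        }

        close(fd);
        return 0;
    }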
Coming back to hiding the devices: the light switch, for example, will look the same and will have the same switches with the same functions, but there will also be some relays behind it that can control the light. By this we can ensure that the user will have time to adjust to the new way of interaction, that a guest, for example, will still be able to switch on the lights and, most of all, that the light can still be turned on even if there is a problem with the system. The system also has a database that logs all the changes, and through it it can notify the user if something happens, or simply provide him with information about the current state of things (if the light was left on, if the window is opened, if the heater is on, etc.). An alerting system can easily be integrated with it and notify the user if something is wrong inside the house.

Devices

At the moment I have the following devices I am planning or working on:

Light Control

The idea was described above and it is very simple. The system should be able to turn the lights on/off in different rooms and to sense if the user does it manually.

Temperature Control

This is based on the way a gas heating system works to heat up the house in the winter.
The system should control each heater and the boiler to preserve as much energy as possible and to keep the temperature in each room constant. The user will be able to choose the temperature for each room separately or for all of them together. Based on the threshold provided, the system will check the temperature in each room and, if it reaches the threshold, it will stop only the heater in that room. This can be done by using an electronic valve, or by modifying a manual valve and using a solenoid to push it in place so that the water flow stops. In case the temperature is the required one in all the rooms, the system will then turn off the boiler.

Coffee Machine

Not something new but something useful: any coffee machine can be changed so that it can be started and stopped from the phone, or given a scheduler to start it at different hours.
Same as the light system, the device will also provide the user with the current status, and he will be able to stop the coffee machine if he left it on when he left the house.

Laundry Notifier

The idea is to weigh the laundry basket and to notify the user when it reaches a weight that makes up a normal washing cycle. Also, the containers with detergent can be weighed, and the system can tell you at any time how many washes you still have left, based on the current weight and the amount of detergent usually used for a washing cycle.
Doors and Windows

If there is a storm outside, you always wonder if you closed all the windows. The system should be able to check the current status and to open/close the doors/windows remotely. I should be able to provide the status with a Hall sensor and to use a crank powered by a motor to open/close them.
TV Control

This device is based on an IR LED that acts as a remote. The idea is to choose from the phone what channel or what show you want to watch, and for it to play automatically. The system should also recognize the image on the screen so it can choose a show from HBO GO, for example, or a certain movie from the media server.

Power Consumption Statistics and Control

A separate power consumption sensor on every electrical outlet can provide good data for me to limit the power consumption. By adding a relay to control that outlet, I can switch off devices I do not use when I am not at home, and I can check if I left something on that I should not have. The software can later be changed to notify me if, for example, the electric oven was left on longer than usual.

Personalized Home Welcoming

I would like to have the lights come on and the music start when I get home.
I could use an RFID tag glued to the key to know who unlocked the house and make different setups for everyone, like different actions, different lights and different playlists. One person can leave messages for another, and the system will know which message to play.

Controlling the Curtains

You always forget to close the curtains before you start watching a movie during the day, and it would be nice to be able to do it from the sofa. A simple curtain that opens when you pull a string already has the mechanical design needed. The only challenge here is to add a motor to the system so I can control it remotely. Using a light sensor, I would be able to make an automatic system that controls the curtains so that the rooms are kept as cool as possible during the summer. The light sensor can also provide information to the central system on the Raspberry and turn on the lights if I am in the room and it gets dark outside.