Saturday, 29 March 2014

Stage (13) Week Commencing (24.03.14)
Week (24)

Description

I had a good session this week as I managed to get into Uni, and I got to work with my tutor on getting the jaw to move when Processing outputs to AppleScript. There is still more work to be done, and we installed a switch to stop the mouth moving on demand, because I still have to work out how to make an audio trigger and wire it straight into the Arduino. The system works through a simple routine in the Arduino code: it adds random co-ordinates to specific angles. For example, if the mouth servo goes over 90 degrees it adds up to 40 random degrees, and the same applies to the delay function; this gives the servos a more natural, sporadic movement. The cut-off switch works through a simple mechanism that breaks the circuit and stops the Arduino receiving information on the analog port; when no information is received, the Arduino reverts to 134 degrees, i.e. mouth closed.
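The jitter-and-cut-off rule described above can be sketched on its own, away from the servo hardware. This is only an illustrative sketch using the threshold, offset, and rest values from this week's notes; the function names are my own, not from the actual sketch.

```cpp
#include <cassert>
#include <cstdlib>

// Illustrative sketch of this week's jaw rule: past a threshold angle,
// add up to 40 random degrees for a more natural, sporadic movement.
const int kThreshold = 90;   // angle at which jitter kicks in
const int kMaxJitter = 40;   // up to this many random extra degrees
const int kClosed    = 134;  // "mouth closed" rest position

int jitteredAngle(int angle) {
    if (angle > kThreshold) {
        angle += std::rand() % kMaxJitter;  // add 0..39 degrees
    }
    if (angle > 180) angle = 180;           // servo hard limit
    return angle;
}

int jawAngle(bool signalPresent) {
    // When the cut-off switch breaks the circuit, no signal reaches the
    // analog port and the jaw reverts to the closed position.
    return signalPresent ? jitteredAngle(95) : kClosed;
}
```

The same shape appears in the Arduino listing below, where the random offset is applied directly to the servo write position.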

Equipment.

Bread Board
Wires
Transistor
Switch

(1.1, 1.2)

Arduino side


#include <Servo.h>
int c=0;
int talkmode=0;
int talkmode2=0;
long rfac;
long mpos;
long rfac2;
long mpos2;
Servo myservo;  // create servo object to control a servo
// a maximum of eight servo objects can be created
Servo myservo2;
Servo myservo3;
Servo myservo4;
int talkcount=255; //eventually use audio stop trigger
void setup()
{
Serial.begin(9600);
pinMode(A0,OUTPUT);
pinMode(A2,OUTPUT);
pinMode(A3,OUTPUT);
pinMode(A4,OUTPUT);
myservo.attach(A0);
myservo2.attach(A2);
// myservo3.attach(A3);
// myservo4.attach(A4);
}
void loop(){
while(Serial.available()>0){
talkmode=Serial.read();
}
if(talkmode==1){
rfac=random(100);
if(rfac<40){
// mpos=random(130);
mpos=95+random(40);
delay(20+random(30));
}else{
mpos=128;
}
int r=analogRead(5);
if(r<1000){
mpos=128;
talkmode=0;
}
}
if(talkmode==0){
mpos=300;
}
myservo.write(mpos);
}




Processing side





import processing.serial.*;

/*
A little example using the classic "Eliza" program.

Eliza was compiled as a Processing library, based on the
java source code by Charles Hayden:
http://www.chayden.net/eliza/Eliza.html

The default script that determines Eliza's behaviour can be
changed with the readScript() function.
Instructions to modify the script file are available here:
http://www.chayden.net/eliza/instructions.txt
*/

import codeanticode.eliza.*;
Serial myport;

Serial myport2; // neck motor
int drawskeleton=0;  //1 / 0

Eliza eliza;
PFont font;
String elizaResponse, humanResponse;
boolean showCursor;
int lastTime;
PImage bg1a;


void setup()
{
size(1200, 786);
background(200,0,0);
//end si
bg1a=loadImage("bg1.jpg");
println(Serial.list());
myport=new Serial(this, Serial.list()[6],9600);
//myport2=new Serial(this, Serial.list()[??????],9600);
// When Eliza is initialized, a default script built into the
// library is loaded.
eliza = new Eliza(this);
// A new script can be loaded through the readScript function.
// It can take local as well as remote files.
eliza.readScript("scriptnew.txt");
//eliza.readScript("http://chayden.net/eliza/script");
// To go back to the default script, use this:
//eliza.readDefaultScript();
font = loadFont("Rockwell-24.vlw");
textFont(font);
printElizaIntro();
humanResponse = "";
showCursor = true;
lastTime = 0;
}

void draw()
{
fill(255);
stroke (111);
text(elizaResponse, 30, 450, width - 40, height);
fill(0);
int t = millis();
if (t - lastTime > 500)
{
showCursor = !showCursor;
lastTime = t;
}
if (showCursor) text(humanResponse + "_", 30, 600, width - 40, height);
else text(humanResponse, 30, 600, width - 40, height);
simpleopennidrawmethod();
 }
void keyPressed()
{
if ((key == ENTER) || (key == RETURN))
{
println(humanResponse);
//first scan for keywords
elizaResponse = eliza.processInput(humanResponse);
println(">> " + elizaResponse);
String[] out={elizaResponse};
saveStrings("/Users/macbookpro/Desktop/test.txt",out);
delay(10);
println(sketchPath+"/data/applescriptbridge.app");
open(sketchPath+"/data/applescriptbridge.app");
myport.write(1);
humanResponse = "";
}
else if ((key > 31) && (key != CODED))
{
// If the key is alphanumeric, add it to the String
humanResponse = humanResponse + key;
}
else if ((key == BACKSPACE) && (0 < humanResponse.length()))
{
char c = humanResponse.charAt(humanResponse.length() - 1);
humanResponse = humanResponse.substring(0, humanResponse.length() - 1);
}
}
void printElizaIntro()
{
String hello = "Hello.";
elizaResponse = hello + " " + eliza.processInput(hello);
println(">> " + elizaResponse);
}

Base Building

Equipment.

1 x 800 mm dia. chequer aluminium plate (1.3)
1 x 800 mm dia. x 300 mm deep wooden circular cut-out (1.4)
Black Spray Paint
Black guitar neck Plate

I built the base for the robot this week, having ordered the bits I needed off eBay. To start with I matched the legs of the robot's base up to the chequer plate and drilled small holes into it; then I did the same for the wooden base, but only partial holes, not all the way through. These were to slot the very tips of the bottom base rods into. I secured the aluminium plate onto the legs using bolts. I then drilled three additional holes into the aluminium plate, pushed screws through the holes, and screwed them into the wooden base plate after I had sprayed it black. I also found an old guitar neck plate that I thought I could use to cover up part of the neck framework to make it look more professional; this may also come in useful later on for attaching the neck framework more securely to the base. (1.5, 1.6)




Feelings

I am happy with how the build is progressing, and I am hoping that I will get the major parts of it finished in the next couple of weeks. I also think the project is starting to look really interesting: people are stopping to have a look and talking to me about the project as they pass my desk, so I'm pleased that people are naturally taking an interest in it.

Evaluation

This session was a good experience. I am not 100% sure how the code interacts from Processing into Arduino using the audio switch, but I will have an in-depth look into it when I have the chance and properly annotate the code. I was pleased with the parts I had ordered; there is always a worry when ordering over the internet that things are not going to be made to the correct size, but everything fitted into place like it should.

Analysis

The current state of the build does make sense to me. I know there are parts of the code I need to go over and examine more closely, but I am happy with where I am with the project and how it relates to the brief. I am really starting to see how the application of code is bringing life to the project, and the more I learn about the interaction between Processing and Arduino, the more I can see how useful it could be for future projects.

Conclusion

In conclusion, a lot has been achieved this session; the project has taken a massive leap and is starting to come together well. I really like how the mouth mechanism works, the action is really smooth. I need to test it with the jaw L/R servo running at the same time, to see whether the same natural movement can be achieved when both systems work simultaneously.

Action Plan

If I had to re-do this session I would write down notes from the time I spent with my lecturer, because I thought I could remember how the process worked in my head, but a few days later I cannot remember it fully. Next session will be spent rigging up a digital audio trigger. I am also thinking that I may leave this circuit in, but use it as a cut-off switch for the whole project by flatlining all the servos.



1.1

1.2


1.3


1.4


1.5



1.6




 


Saturday, 22 March 2014

Stage (12) Week commencing (17.03.14)
Week (23)

Description

I am still having trouble affording travel to University, so I have had to do a lot of work from home over these past couple of weeks. I did manage to make it in last Thursday with a cheap day-return ticket, but when I got into Uni I had major issues with my robot: the servo that powers the up-and-down neck movement blew. I think it may have been knocked on the train when I was transporting it to Uni. It was pretty frustrating, as I felt I had missed an opportunity to get on with the build with the help of my tutor. I ended up spending my time at Uni swapping the servo for another one. Luckily I had a spare Futaba servo with me; however, it has plastic rather than metal gears and did not have enough torque to power the neck mechanism. I decided this servo might be better suited to the up-and-down jaw movement, so I took the MG996R out of the jaw mechanism, replaced it with the Futaba servo, and used the MG996R for the neck mechanism. This was actually a good move, because the Futaba servo is more reactive than the metal-gear servo and is better suited to the jaw movement thanks to its high speed ratio; however, the MG996R is not powerful enough to move the head up and down via the neck mechanism. I will save up and get a servo with a higher torque rating for this mechanism.

I have installed a small speaker into the top jaw of the robot. The speaker faces down into the mouth, the idea being that when the robot's jaw opens and closes the sound reacts with this movement. The speaker is 3 W but is pretty loud; it is rechargeable and has an 8-hour battery life (although I have not tested this). I also purchased a long speaker lead that I can connect to the laptop's audio output. I had to get a lead with a right-angle connector, because a straight connector stuck out and the lead interfered with the eye mechanics. I used Milliput to secure the speaker in place; it also gave the sound direction, because it stops sound escaping from the sides of the upper jaw piece. (1.1)

In addition, I have been working on recording the co-ordinates for all the motors, so I know exactly what ranges I have to work between. Here is a list of those points.

JAW

Bottom Servo

up - 130°
down - 90°

Top Servo

left - 76°
right - 86°
mid - 80°

BOTTOM LIP

out - 65°
in - 100°

EYES

Top servo

up - 173°
down - 0°

Bottom servo

left - 0°
right - 180°

TOP LIP

Left Servo

up - 65°
down - 110°

Right Servo

up - 70°
down - 20°

HEAD

Mid Point - 85°

(1.2)
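The measured end-points above can be collected into named constants with a simple clamp helper, so a commanded angle can never be driven past a servo's safe range. This is only a sketch: the angle values are the measurements above, but the names and the helper are my own, not part of the actual sketch code.

```cpp
#include <algorithm>
#include <cassert>

// Measured servo end-points from this week's calibration notes.
// Only the angle values come from the notes; names are illustrative.
struct Range { int lo, hi; };

const Range JAW_BOTTOM = { 90, 130 };  // down .. up
const Range JAW_TOP    = { 76,  86 };  // left .. right (mid 80)
const Range BOTTOM_LIP = { 65, 100 };  // out .. in
const Range EYES_TOP   = {  0, 173 };  // down .. up
const Range EYES_BOT   = {  0, 180 };  // left .. right
const Range TOP_LIP_L  = { 65, 110 };  // up .. down
const Range TOP_LIP_R  = { 20,  70 };  // down .. up
const int   HEAD_MID   = 85;           // head mid point

// Keep any commanded angle inside a servo's safe range.
int clampTo(const Range& r, int angle) {
    return std::max(r.lo, std::min(r.hi, angle));
}
```

Clamping every write this way would protect the mechanics if a random offset ever pushed an angle past its measured limit.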

Feelings

I am feeling great about the project, but not being able to attend Uni regularly is really taking its toll on me. I am constantly on the phone with Student Finance trying to get them to pay me what they owe me; I get promised it will be sorted out in 7 days, and it never is, I just get more letters demanding information they already have.

Evaluation

This session was a good experience; the link between the computing side and the physical construction is now a lot stronger thanks to the audio output.

Analysis

The current situation is frustrating: all I want to do is get on with this project, but without the funding I need, progress is slowing, coupled with malfunctioning servos. I hope this gets sorted soon so I can get back on track.

Conclusion

I do not think there is much else I could have done this session. I have also started another project, my special study into A.I., so at least I can carry on with that while I'm waiting for Student Finance to pull their heads out of their asses.

Action Plan

I am going to look at putting a solid base together, maybe using thick metal to add weight to it, because the robot moves around a lot when the servos are in motion; I will research this for next session.



1.1



1.2



Additional:

I have been thinking about how I can transport the project around safely without damaging any parts of it. What I have come up with is a way of connecting the bottom lip brass connector under the edge of the second base ring. This would make it easier to transport by reducing the size of the rig, and would also put less strain on the neck motor. The rig will then fit into a cardboard box and into a padded holdall.





Wednesday, 5 March 2014

Stage (11) Week Commencing (03.03.14)
Week (21)

Description


This blog is a roll-over from Friday's session at University, where I got help from my lecturer with the AppleScript code; here is the outcome of that session. AppleScript is an unfamiliar script for me, and although it was identified at the start of the project as the most suitable process for outputting the voice script, I still needed help in exporting the app to do this.


To start with we had to alter the speech output and make a bridge from AppleScript to Processing; the code looks like this.

set theVoices to {"Alex", "Bruce", "Fred", "Kathy", "Vicki", "Victoria"}

set thePath to (path to desktop as Unicode text) & "test.txt" -- set path to desktop

set the_file to thePath -- set the file to the path

set the_text to (do shell script "cat " & quoted form of (POSIX path of the_file)) 

set the clipboard to the_text

set theSentence to the clipboard

log (theSentence)

say theSentence using ("Victoria") speaking rate 104 modulation 15 pitch 8

on readFile(unixPath)
return (do shell script "cat /" & unixPath)
end readFile


We set the path to desktop/test.txt, exported the code as an application, and placed the application inside the data folder of the Eliza sketch. Using Processing, we can save the output string from Eliza into test.txt, then open the bridge application, which uses the voice function in AppleScript to make the script audible through the speakers. I changed the modulation, speaking rate, and pitch to make it sound more melancholy.

CODE:
import codeanticode.eliza.*;
Eliza eliza;
PFont font;
String elizaResponse, humanResponse;
boolean showCursor;
int lastTime;
PImage bg1a;

void setup()
{
size(1200, 786);
bg1a=loadImage("bg1.jpg");
// When Eliza is initialized, a default script built into the
// library is loaded.
eliza = new Eliza(this);
// A new script can be loaded through the readScript function.
// It can take local as well as remote files. 
eliza.readScript("scriptnew.txt");
//eliza.readScript("http://chayden.net/eliza/script");
// To go back to the default script, use this:
//eliza.readDefaultScript();
font = loadFont("Rockwell-24.vlw");
textFont(font);
printElizaIntro();
humanResponse = "";
showCursor = true;
lastTime = 0;
}

void draw()
{
image(bg1a,0,0,width,height);
//background(102);
fill(255);
stroke (111);
text(elizaResponse, 30, 450, width - 40, height);
fill(0);
int t = millis();
if (t - lastTime > 500)
{
showCursor = !showCursor;
lastTime = t;
}
if (showCursor) text(humanResponse + "_", 30, 600, width - 40, height);
else text(humanResponse, 30, 600, width - 40, height);
}
void keyPressed() 
{
if ((key == ENTER) || (key == RETURN)) 
{
println(humanResponse);
//first scan for keywords               
elizaResponse = eliza.processInput(humanResponse);
println(">> " + elizaResponse);
String[] out={elizaResponse};
saveStrings("/Users/macbookpro/Desktop/test.txt",out);
delay(10);
println(sketchPath+"/data/applescriptbridge.app");
open(sketchPath+"/data/applescriptbridge.app");
humanResponse = "";
}
else if ((key > 31) && (key != CODED)) 
{
// If the key is alphanumeric, add it to the String
humanResponse = humanResponse + key;
}
else if ((key == BACKSPACE) && (0 < humanResponse.length()))
{
char c = humanResponse.charAt(humanResponse.length() - 1);
humanResponse = humanResponse.substring(0, humanResponse.length() - 1);
}
}
void printElizaIntro()
{
String hello = "Hello.";
elizaResponse = hello + " " + eliza.processInput(hello);
println(">> " + elizaResponse);
}


Here is an example of the voice output from Processing to AppleScript.



Feelings

          I am feeling very positive after the session with my tutor, and I am really impressed with how the voice adds another level to the build; I'm just really happy with it all at the moment.

Evaluation

          Going into Uni and having a one-on-one session with my tutor has been a massive boost; what would have taken me days to figure out was done and explained to me in a couple of hours.

Analysis

          My tutor made it clear to me how the process works, and I feel like I have a good grasp of which processes were implemented and how he used them to create the outcome. It made sense to me, and I was happy experimenting on my own with some of the functions in AppleScript.

Conclusion

          In conclusion, this session was extremely fun and interesting. I would not do anything differently; I did not feel like I had to make loads of notes to make sense of the processes involved, and I'm happy with the outcome.

Action Plan

          Next session I want to concentrate on getting the mouth moving in time with the audio output; however, because of ongoing financial issues, I may struggle to get into Uni for a couple of weeks.