12/08/2013 - A new beginning... again
You know when you finish something and then suddenly realise there is a much, much better way of doing it?
The new plan means simpler genes and less complex inner workings; it should allow greater connectivity inside the brain while still letting "Clusters" of neurons form which are less likely to mutate in a damaging way.
05/08/2013 - A new beginning
I have started the basic brain logic from scratch. Theoretically it does the same thing, just in a far more CPU-friendly way. It should also lead to a less confusing graphical brain/neuron editor and be easier to integrate into future projects of mine (or yours). I've scrapped the old environment I was working on, again because it was so inefficient, and I will make a better one in due course.
The new brain works on a single thread, though the idea that the brain is "event driven" has stuck: no pointlessly updating 10,000 neurons when only 3 messages are being transmitted! I know that premature optimisation is evil, but in this case I can safely say that the old brain was, and always would have been, a severe bottleneck!
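The event-driven idea can be sketched roughly like this (class and method names here are illustrative, not the project's actual API): neurons sit idle until a message reaches them, and each tick processes only the neurons that crossed their threshold.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Illustrative names only -- not the project's real classes.
class Neuron {
    final List<Neuron> targets = new ArrayList<>(); // downstream connections
    final int threshold;                            // messages needed to fire
    int charge = 0;                                 // messages received so far
    Neuron(int threshold) { this.threshold = threshold; }
}

class Brain {
    private final Queue<Neuron> pending = new ArrayDeque<>();

    // Deliver one message; queue the neuron only if it crosses its threshold.
    void stimulate(Neuron n) {
        n.charge++;
        if (n.charge >= n.threshold) pending.add(n);
    }

    // One tick: process only the neurons that actually fired.
    // Idle neurons cost nothing, no matter how many exist.
    int step() {
        Queue<Neuron> firing = new ArrayDeque<>(pending);
        pending.clear();
        for (Neuron n : firing) {
            n.charge = 0;
            for (Neuron target : n.targets) stimulate(target);
        }
        return firing.size();
    }
}
```

The point of the queue is that the cost of a tick scales with the number of messages in flight, not with the total neuron count.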
05/12/2012 - Early progress and Sensor plans
Progress so far:
I have almost finished the inner workings of the brain; I simply need to add methods that can randomly mutate a BrainBlueprint's structure. These methods will be used when an organism reproduces, so that its offspring have a chance of exhibiting new behaviours. Most mutations will have no impact or will make the offspring massively unfit, but that is where natural selection comes into play.
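A mutation pass at reproduction time might look something like this sketch. The post doesn't show BrainBlueprint's real structure, so the weight list, method names and parameters here are all assumptions: each part of the blueprint gets a small chance of being perturbed, so most offspring stay close to the parent.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hypothetical sketch -- the real BrainBlueprint may store structure
// differently (connections, thresholds, topology, etc.).
class BrainBlueprint {
    final List<Double> weights = new ArrayList<>();

    BrainBlueprint copy() {
        BrainBlueprint child = new BrainBlueprint();
        child.weights.addAll(weights);
        return child;
    }

    // Returns a mutated copy: each weight is nudged with probability `rate`,
    // by a uniform amount in [-strength, +strength].
    BrainBlueprint mutatedChild(Random rng, double rate, double strength) {
        BrainBlueprint child = copy();
        for (int i = 0; i < child.weights.size(); i++) {
            if (rng.nextDouble() < rate) {
                child.weights.set(i,
                    child.weights.get(i) + (rng.nextDouble() * 2 - 1) * strength);
            }
        }
        return child;
    }
}
```

Keeping the mutation rate low is what lets stable clusters of behaviour survive from one generation to the next.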
I am now working on the simulated Environment in which the Organisms live, and on adding new senses to the Organism; the BinaryEye sense is already working as expected. Planned Senses are:
Taste buds - these will be located on the front of the organism and will trigger if the space in front contains what they are designed to taste.
BinaryEye - This has 5 Rod Sensors, each facing a slightly different direction. If an object within 20 pixels is more dark than light, any Rod Sensors facing it will trigger. (implemented)
MonochromeEye - This has 5 Rod Sensor Groups; all rods in a group point in the same direction, and the darker the object, the more of them fire.
PolyChromeEye - This has 5 Sensor Groups, each containing 3 Cone Sensor Groups. A Cone Sensor is like a Rod Sensor but is specific to one colour (Red, Green or Blue in this case). As in the MonochromeEye, the more e.g. blue an object is, the more Blue Cones are triggered; Red Cones for red objects, etc.
Hunger - triggers if energy reserves are low.
Pain - triggers if Organism is damaged (Ethical issues here?)
BreedingWeightDetector - triggers if Organism has enough energy to reproduce.
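The senses above all share one pattern: each reduces some piece of world or body state to a binary trigger that feeds the brain. A minimal sketch of the two energy-based senses, with hypothetical interface, class and field names:

```java
// Hypothetical API -- the post does not specify the real one.
interface Sense {
    boolean triggered();
}

class Organism {
    double energy;           // current energy reserves
    double reproductionCost; // energy needed to reproduce
    Organism(double energy, double reproductionCost) {
        this.energy = energy;
        this.reproductionCost = reproductionCost;
    }
}

// Hunger: triggers when energy reserves are low.
class Hunger implements Sense {
    private final Organism owner;
    private final double lowEnergy;
    Hunger(Organism owner, double lowEnergy) {
        this.owner = owner;
        this.lowEnergy = lowEnergy;
    }
    public boolean triggered() { return owner.energy < lowEnergy; }
}

// BreedingWeightDetector: triggers when there is enough energy to reproduce.
class BreedingWeightDetector implements Sense {
    private final Organism owner;
    BreedingWeightDetector(Organism owner) { this.owner = owner; }
    public boolean triggered() { return owner.energy >= owner.reproductionCost; }
}
```

The eyes fit the same shape, they just inspect the grid in front of the organism instead of its own state.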
Planned Effectors are:
Left and Right Fins - turn the organism 90 degrees to the left or right respectively when triggered. (implemented)
Tail - Moves the organism forwards one space when triggered. (implemented)
Herbivore Mouth - attempts to eat plant type food one block in front of the creature when triggered.
Scavenger Mouth - attempts to eat anything one block in front of the creature when triggered.
Carnivore Mouth - attempts to eat meaty food one block in front of the creature when triggered; damages anything in front of the Organism at the same time.
ReproductiveOrgan - creates a new organism based on the parent, placing it behind the parent and facing away (to stop carnivorous Organisms instantly attacking their young or parents).
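The movement effectors are simple grid operations. A sketch, assuming a four-direction heading and one-cell moves (names and the coordinate convention are my own, not the project's):

```java
// Hypothetical effector API over a grid world.
interface Effector {
    void trigger(Creature c);
}

class Creature {
    int x, y;
    int heading; // 0 = north, 1 = east, 2 = south, 3 = west
}

// Fins rotate the heading by 90 degrees without moving the creature.
class LeftFin implements Effector {
    public void trigger(Creature c) { c.heading = (c.heading + 3) % 4; }
}

class RightFin implements Effector {
    public void trigger(Creature c) { c.heading = (c.heading + 1) % 4; }
}

// The Tail moves one space in the direction the creature is facing
// (y decreases northwards in this sketch).
class Tail implements Effector {
    public void trigger(Creature c) {
        switch (c.heading) {
            case 0:  c.y--; break; // north
            case 1:  c.x++; break; // east
            case 2:  c.y++; break; // south
            default: c.x--; break; // west
        }
    }
}
```

The mouths and ReproductiveOrgan would implement the same interface, acting on the cell the creature faces rather than on its position.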
27/11/2012 - The big idea
This is my most recent and potentially my most ambitious idea.
AI has always fascinated me, and it has always seemed obvious to me that if we want a human-like AI we shouldn't be creating it from the top down, starting with higher functions like reading and learning. We should build it from the basics upwards, as the basics have a fundamental effect on how we perceive the world and everything in it.
I'm not going to pretend that I'll be able to evolve a human-like intelligence, but I'd like to make a proof of concept. I've written a set of classes capable of simulating a basic nerve net: it takes binary inputs and creates binary outputs. These inputs and outputs can represent anything you like. I am making a small 2D environment in which I will attempt to simulate animals searching for food.
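The "binary inputs in, binary outputs out" idea can be illustrated with a tiny feed-forward threshold net. The wiring scheme below is my own illustration, not the project's actual class layout:

```java
// Illustrative sketch of a fixed threshold net, not the project's classes.
class NerveNet {
    private final int[][] wiring;   // wiring[o] = input indices feeding output o
    private final int[] thresholds; // active inputs needed for output o to fire

    NerveNet(int[][] wiring, int[] thresholds) {
        this.wiring = wiring;
        this.thresholds = thresholds;
    }

    // Each output fires if enough of its wired inputs are true.
    boolean[] process(boolean[] inputs) {
        boolean[] outputs = new boolean[wiring.length];
        for (int o = 0; o < wiring.length; o++) {
            int active = 0;
            for (int i : wiring[o]) {
                if (inputs[i]) active++;
            }
            outputs[o] = active >= thresholds[o];
        }
        return outputs;
    }
}
```

Wiring both outputs to the same two inputs with thresholds 1 and 2 gives an OR and an AND of those inputs, which is enough to show that interesting behaviour can come purely from connectivity.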
There is a brain editor, which lets you see the brain's neurons on screen and how they are connected, and also lets you edit neurons and their connections.
The project's GitHub - all code is open source.
WARNING: This version does nothing at all... I've started to re-write the basic logic of the brain and so far I have not had time to implement any visual functionality.
The latest: Neural Web.jar