Life is a game, take it seriously


Training a Rap Machine

In AI, brain, deep learning, Machine Learning, Serious Stuffs on January 9, 2020 at 7:15 pm

by Li Yang Ku (Gooly)

(link to the rap machine if you prefer to try it out first)

In my previous post, I gave a short tutorial on how to use the Google AI platform for small garage projects. In this post, I am going to follow up and talk about how I built (or more like an attempt to build) my holiday project, a machine that completes your rap lyrics using the “Transformer” neural network.

Transformer is a neural network model introduced by Google Brain, mostly for language-related tasks. What is interesting about this architecture is that instead of taking one word at a time, it takes in the whole input sentence at once and learns the relationship between each pair of words. This allows Transformers to learn useful relationships, such as what a pronoun refers to in a sentence. In the original paper “Attention is All You Need”, this ability to understand relations between words is referred to as attention, since the model can focus more on certain pairs of words. I will not go into the details of the Transformer since quite a few people have already explained it at great length in their blogs (such as this blog and this blog). My rationale was that the Transformer’s ability to learn relationships between words in rap sentences should allow the network to learn which words rhyme well together or have the right flow.
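For readers who want a feel for what “attention” computes without reading the paper, here is a minimal numpy sketch of the scaled dot-product attention at the core of the Transformer. The shapes and random inputs are purely illustrative, not the actual model:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query (word) scores every key (word), then takes a
    softmax-weighted blend of the values. High weights between a
    pair of words is what the paper calls "attending" to each other."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise word-to-word affinities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over all words
    return weights @ V                               # blend values by attention weight

# toy example: a 3-word "sentence" with embedding size 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one blended vector per word
```

A multi-head Transformer simply runs several of these attention computations in parallel (the “heads”) with different learned projections, letting each head specialize in a different kind of word relationship.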

I collected rap lyrics from 14 artists, a total of around 180,000 lines of rap. These lyrics were further filtered down to around 50,000 lines that I considered to “rhyme” with another line. The first line is the input and the second line is the output. These sentence pairs are then split into training and evaluation sets (90:10 split). The Transformer architecture I used is mostly based on this Google Cloud Platform tutorial on generating poetry. After a bit of hyperparameter tuning, I ended up with a Transformer with 2 hidden layers, 4 attention heads, and a hidden layer size of 512. This model was then trained for 75,000 iterations. Experimenting with different models through trial and error cost me a bit more than a hundred bucks (note that you are given $300 in credits to start).
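The post doesn’t spell out how the “rhyming” pairs were detected, so the sketch below is an assumption: a crude heuristic that treats two consecutive lines as rhyming when their last words share a suffix, followed by the 90:10 train/eval split. The function names and the toy lyrics are made up for illustration:

```python
import random

def last_word(line):
    """Grab the final word of a lyric line, lowercased."""
    words = line.strip().lower().split()
    return words[-1] if words else ""

def crude_rhyme(a, b, suffix_len=2):
    """Assumed heuristic: two lines 'rhyme' if their last words differ
    but end with the same few characters (e.g. 'back' / 'track')."""
    wa, wb = last_word(a), last_word(b)
    return (wa != wb
            and len(wa) >= suffix_len and len(wb) >= suffix_len
            and wa[-suffix_len:] == wb[-suffix_len:])

# toy corpus standing in for the ~180,000 scraped lines
lyrics = [
    "I was gone but now I'm back",
    "where's the beat, I lost the track",
    "this rhyme costs a fee",
    "but the next one's free",
]

# keep (input, output) pairs of consecutive lines that pass the filter
pairs = [(lyrics[i], lyrics[i + 1])
         for i in range(len(lyrics) - 1)
         if crude_rhyme(lyrics[i], lyrics[i + 1])]

# 90:10 train/eval split, as in the post
random.seed(0)
random.shuffle(pairs)
cut = int(0.9 * len(pairs))
train, evaluation = pairs[:cut], pairs[cut:]
```

A character-suffix match is a blunt instrument (it misses rhymes like “deep” / “mean” that the trained model produced); a phoneme-based comparison using a pronunciation dictionary would be a more faithful filter, at the cost of more dependencies.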

The final result was, well, entertaining but with limited success. Occasionally it will rhyme if your sentence ends with a common word, but most of the time it’s just a weirdo spitting out random sentences with a semi-rapper attitude. The following are a few example results:

Input: This thing I made raps like a machine

Output: It’s all fair in love with me

Input: It generates raps that run deep

Output: That’s what I mean

Input: No rap is free, this thing cost me a fee

Output: I got a whole lotta guap on me

Input: Consider this as a new year gift that raps

Output: Chillin’ on some wildin’ out of that

Input: You might say my input rap is bad

Output: I was gone but now I’m back

Input: If you don’t like your output rap

Output: Where the fuck am I going back

You can try out the rap machine here yourself. Thank you all for reading this blog and wish you all an entertaining 2020!

Looking Into Neuron Activities: Light Controlled Mice and Crystal Skulls

In brain, Neural Science, Paper Talk, Serious Stuffs on April 2, 2017 at 9:50 pm

by Li Yang Ku (Gooly)

It might feel like there hasn’t been much progress in brain theories recently; we still know very little about how signals are processed in our brain. However, scientists have moved away from sticking electrical probes into cat brains and have become quite creative at monitoring brain activity.

Optogenetics techniques, first tested in the early 2000s, allow researchers to activate neurons in a live brain with light. By controlling the light that activates motor neurons in a mouse, scientists can control its movement remotely, creating a “remote controlled mouse” which you might have heard of in some not-that-popular sci-fi novels. This is achieved by taking the DNA segment of an alga that produces light-sensitive proteins and inserting it into specific brain neurons of the mouse using viral vectors. When light is shone on this protein, it opens its ion channel and activates the neuron. The result is pretty cool, but not as precise as your remote control car, yet. (see video below)

Besides optogenetics techniques, which probe the function of a neuron by actively triggering it, methods for monitoring neuron activity directly have also become quite exciting, such as genetically modified mice with brain neurons that glow when activated. These approaches, which use fluorescent markers to monitor the level of calcium in the cell, can be traced back to the green fluorescent protein introduced by Chalfie et al. in 1994. With fluorescent indicators that bind with calcium, researchers could actually see brain activity for the first time. A lot of progress has been made on improving these markers since; in 2007 a group at Harvard introduced the “Brainbow”, which can generate up to 90 different fluorescent colors. This made it much easier for scientists to identify neuron connections and also helped them win a few photo contests.

To better observe these fluorescent protein sensors (calcium imaging), a 2016 publication further introduced the “crystal skull”, an approach that replaces the top of the skull of a genetically modified mouse with a curved glass. This quite fancy approach allows researchers to monitor the activity of half a million brain neurons in a live mouse by mounting a fluorescence macroscope on top of the crystal skull.

References:

Chalfie, Martin. “Green fluorescent protein as a marker for gene expression.” Trends in Genetics 10.5 (1994): 151.

Madisen, Linda, et al. “Transgenic mice for intersectional targeting of neural sensors and effectors with high specificity and performance.” Neuron 85.5 (2015): 942-958.

Josh Huang, Z., and Hongkui Zeng. “Genetic approaches to neural circuits in the mouse.” Annual review of neuroscience 36 (2013): 183-215.

Kim, Tony Hyun, et al. “Long-Term Optical Access to an Estimated One Million Neurons in the Live Mouse Cortex.” Cell Reports 17.12 (2016): 3385-3394.