Showing posts with label Human Brain.

Friday, March 8

Signal Detected in Human Brain


Scientists have identified a unique form of cell messaging occurring in the human brain, revealing just how much we still have to learn about its mysterious inner workings.


Excitingly, the discovery hints that our brains might be even more powerful units of computation than we realized.


Back in 2020, researchers from institutes in Germany and Greece reported a mechanism in the brain's outer cortical cells that produces a novel 'graded' signal all on its own, one that could provide individual neurons with another way to carry out their logical functions.


By measuring the electrical activity in sections of tissue removed during surgery on epileptic patients and analyzing their structure using fluorescent microscopy, the researchers found that individual cells in the cortex used not just the usual sodium ions to 'fire', but calcium as well.  READ MORE...

Friday, December 15

Supercomputer that Simulates Entire Human Brain


A neuromorphic supercomputer called DeepSouth will be capable of 228 trillion synaptic operations per second, which is on par with the estimated number of operations per second in the human brain.
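As a rough sanity check on that comparison, here is a back-of-envelope sketch in Python. The 86 billion neuron count is a commonly cited estimate; the synapses-per-neuron figure and the average firing rate are rough assumptions of mine, not numbers from the article:

```python
# Back-of-envelope comparison of DeepSouth's throughput with a crude
# estimate of the brain's synaptic event rate. Biological figures are
# rough, commonly cited estimates -- not taken from the article.
NEURONS = 86e9             # neurons in a human brain (common estimate)
SYNAPSES_PER_NEURON = 1e4  # assumed rough average synapse count per neuron
AVG_FIRING_HZ = 0.3        # assumed brain-wide average firing rate

synapses = NEURONS * SYNAPSES_PER_NEURON       # ~8.6e14 synapses
brain_ops_per_sec = synapses * AVG_FIRING_HZ   # ~2.6e14 synaptic events/s

deepsouth_ops_per_sec = 228e12                 # figure from the article

print(f"Estimated brain: {brain_ops_per_sec:.2e} synaptic events/s")
print(f"DeepSouth:       {deepsouth_ops_per_sec:.2e} synaptic ops/s")
```

Under these assumptions the two rates land in the same ballpark, which is all "on par" can really mean here; changing the assumed firing rate shifts the brain estimate by an order of magnitude either way.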

A supercomputer capable of simulating, at full scale, the synapses of a human brain is set to boot up in Australia next year, in the hopes of understanding how our brains process massive amounts of information while consuming relatively little power.  READ MORE...

Sunday, April 23

Human Brain Detects Signal


Scientists have recently identified a unique form of cell messaging occurring in the human brain that's not been seen before.  Excitingly, the discovery hints that our brains might be even more powerful units of computation than we realized.


Back in 2020, researchers from institutes in Germany and Greece reported a mechanism in the brain's outer cortical cells that produces a novel 'graded' signal all on its own, one that could provide individual neurons with another way to carry out their logical functions.


By measuring the electrical activity in sections of tissue removed during surgery on epileptic patients and analyzing their structure using fluorescent microscopy, the researchers found that individual cells in the cortex used not just the usual sodium ions to 'fire', but calcium as well.


This combination of positively charged ions kicked off waves of voltage that had never been seen before, referred to as calcium-mediated dendritic action potentials, or dCaAPs.


Brains – especially those of the human variety – are often compared to computers. The analogy has its limits, but on some levels they perform tasks in similar ways.


Both use the power of an electrical voltage to carry out various operations. In computers it's in the form of a rather simple flow of electrons through intersections called transistors.


In neurons, the signal is in the form of a wave of opening and closing channels that exchange charged particles such as sodium, chloride, and potassium. This pulse of flowing ions is called an action potential.  READ MORE...
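To make the analogy concrete, here is a minimal leaky integrate-and-fire neuron in Python. This is a standard textbook simplification, not the dCaAP mechanism from the study, and every parameter below is illustrative: the membrane voltage integrates input current until it crosses a threshold, "fires", and resets.

```python
# Minimal leaky integrate-and-fire neuron: a toy sketch of the
# all-or-nothing "firing" described above. Parameters are illustrative,
# not drawn from the research.

def simulate(input_current, steps=100, dt=1.0,
             threshold=-55.0, v_rest=-70.0, v_reset=-75.0, tau=10.0):
    """Return the time steps at which the model neuron 'fires'."""
    v = v_rest
    spikes = []
    for t in range(steps):
        # Voltage leaks back toward rest while integrating the input.
        v += dt * ((v_rest - v) / tau + input_current)
        if v >= threshold:   # crossing threshold -> an action potential
            spikes.append(t)
            v = v_reset      # the voltage resets after each spike
    return spikes

print(simulate(2.0))  # strong steady input: the neuron fires repeatedly
print(simulate(0.5))  # weak input: the voltage never reaches threshold
```

The design choice worth noticing is the threshold-and-reset pair: it is what turns a smooth analog voltage into the discrete, pulse-like events the article compares to transistor switching.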

Wednesday, February 23

Quantum Computer Mind


Quick: what’s 4 + 5? Nine, right? Slightly less quick: what’s five plus four? Still nine, right?

Okay, let’s wait a few seconds. Bear with me. Feel free to have a quick stretch.

Now, without looking, what was the answer to the first question?

It’s still nine, isn’t it?

You’ve just performed a series of advanced brain functions. You did math based on prompts designed to appeal to entirely different parts of your brain, and you displayed the ability to recall previous information when queried later. Great job!

This might seem like old hat to most of us, but it’s actually quite an amazing feat of brain power.

And, based on some recent research by a pair of teams from the University of Bonn and the University of Tübingen, these simple processes could indicate that you’re a quantum computer.

Let’s do the math

Your brain probably isn’t wired for numbers. It’s great at math, but numbers are a relatively new concept for humans.

Numbers showed up in human history approximately 6,000 years ago with the Mesopotamians, but our species has been around for about 300,000 years.

Prehistoric humans still had things to count. They didn’t randomly forget how many children they had just because there wasn’t a bespoke language for numerals yet.

Instead, they found other methods for expressing quantities or tracking objects, such as holding up their fingers or using representative models.

If you had to keep track of dozens of cave-mates, for example, you might carry a pebble to represent each one. As people trickled in from a hard day of hunting, gathering, and whatnot, you could shift the pebbles from one container to another as an accounting method.
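The pebble method is just one-to-one correspondence, and it can be sketched in a few lines of Python (the container names and the head-count of twelve are, of course, made up for illustration):

```python
# The pebble bookkeeping described above, as one-to-one correspondence:
# no numerals needed, just matching one pebble to one cave-mate.
out_hunting = ["pebble"] * 12   # one pebble per cave-mate who left
back_home = []

def person_returns():
    """Move one pebble from the 'out' container to the 'home' container."""
    back_home.append(out_hunting.pop())

for _ in range(12):
    person_returns()

# When the 'out' container is empty, everyone is accounted for --
# a complete count without ever using a number word.
print("everyone home:", not out_hunting)
```

The emptiness check at the end is the whole trick: the accounting works by matching, not by naming quantities.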

It might seem sub-optimal, but the human brain really doesn’t care whether you use numbers, words, or concepts when it comes to math.

Let’s do the research

The aforementioned research teams recently published a fascinating paper titled “Neuronal codes for arithmetic rule processing in the human brain.”

As the title intimates, the researchers identified an abstract code for processing addition and subtraction inside the human brain. This is significant because we really don’t know how the brain handles math.

You can’t just slap some electrodes on someone’s scalp or stick them in a CAT scan machine to suss out the nature of human calculation.

Math happens at the individual neuron level inside the human brain. EEG readings and CAT scans can only provide a general picture of all the noise our neurons produce.

And, as there are some 86 billion neurons making noise inside our heads, those kinds of readings aren’t what you’d call an “exact science.”

The Bonn and Tübingen teams got around this problem by conducting their research on volunteers who already had intracranial electrode implants for the treatment of epilepsy.

Nine volunteers met the study’s criteria and, because of the nature of their implants, they were able to provide what might be the world’s first glimpse into how the brain actually handles math.  READ MORE...

Monday, February 14

Cognitive Computing


The goal of cognitive computing is to simulate human thought processes in a computerized model. Using self-learning algorithms that draw on data mining, pattern recognition, and natural language processing, the computer can mimic the way the human brain works.

While computers have been faster at calculations and processing than humans for decades, they haven’t been able to accomplish tasks that humans take for granted as simple, like understanding natural language or recognizing unique objects in an image.


Some people say that cognitive computing represents the third era of computing: we went from computers that could tabulate sums (1900s) to programmable systems (1950s), and now to cognitive systems.

These cognitive systems, most notably IBM’s Watson, rely on deep learning algorithms and neural networks to process information by comparing it to a teaching set of data. The more data the system is exposed to, the more it learns, and the more accurate it becomes over time. The neural network is a complex “tree” of decisions the computer can make to arrive at an answer.
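Here is a hedged illustration of that learn-from-a-teaching-set loop, using a nearest-centroid toy classifier rather than anything resembling Watson's actual deep-learning pipeline: with more labelled examples, the learned class centers settle closer to the truth, and accuracy on fresh data improves.

```python
# Toy "more data -> more accurate" demonstration: a nearest-centroid
# classifier learns two overlapping 1-D classes from labelled samples.
# All numbers here are invented for illustration.
import random

random.seed(0)

def sample(label):
    """Draw a noisy 1-D point from one of two classes."""
    center = 0.0 if label == 0 else 1.0
    return center + random.gauss(0, 0.6)

def train(n):
    """Estimate each class centroid from n labelled examples per class."""
    c0 = sum(sample(0) for _ in range(n)) / n
    c1 = sum(sample(1) for _ in range(n)) / n
    return c0, c1

def accuracy(model, trials=2000):
    """Score the model on freshly drawn labelled points."""
    c0, c1 = model
    correct = 0
    for _ in range(trials):
        label = random.randrange(2)
        x = sample(label)
        guess = 0 if abs(x - c0) < abs(x - c1) else 1
        correct += (guess == label)
    return correct / trials

print("trained on   5 examples/class:", accuracy(train(5)))
print("trained on 500 examples/class:", accuracy(train(500)))
```

Because the classes overlap, no amount of data yields perfect accuracy; the point is only that the learned centroids, like the "teaching set" comparisons described above, get more reliable as exposure to data grows.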

What can cognitive computing do?

For example, according to this TED Talk video from IBM, Watson could eventually be applied in a healthcare setting to help collate the span of knowledge around a condition, including patient history, journal articles, best practices, diagnostic tools, etc., analyze that vast quantity of information, and provide a recommendation.  READ MORE.