Notes on computer architecture



PDF version: Notes on computer architecture – Logan Thrasher Collins

Main memory

  • Some computers store data using flip-flop circuits. Each flip-flop circuit possesses a configuration of logic gates (including AND, OR, and NOT gates) that allows switching between “on” and “off” states corresponding to 1 and 0 (Fig. 1).
  • More modern machines often use conceptually similar ways of storing data that involve using tiny electric charges to represent 1 and 0 states.
  • Each memory cell contains eight flip-flop circuits (or similar storage devices) that correspond to eight bits of memory. Together, eight bits are equal to one byte.
  • The memory cell’s eight bits are depicted as arranged in a line. The leftmost end is called the high-order end and the rightmost end is called the low-order end. The leftmost bit is called the most significant bit and the rightmost bit is called the least significant bit.
  • In order for the computer to find specific memory cells within main memory, every cell is assigned a unique numeric address. This can be visualized as a series of memory cells lined up and numbered starting from zero. In this way, individual cells are not only identifiable but also ordered relative to other cells (a small sketch of this addressing scheme follows this list).
  • Since the computer can independently access any cell that is needed for a computation (despite the cells possessing an ordered configuration), the main memory is called random access memory (RAM) (Fig. 2).
  • For computers that use tiny charges (rather than flip-flop circuits) to store data, the main memory is called dynamic RAM or DRAM because the charges are volatile, dissipate quickly, and must be restored many times per second using a refresh circuit.
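As a minimal sketch of the ideas above (the cell contents are arbitrary example values), main memory can be modeled in Python as a list of one-byte cells indexed by address:

```python
# Model main memory as a list of byte-sized cells; the list index is the address.
ram = [0b10110010, 0b00001111, 0b11000001]  # arbitrary example contents

address = 0
cell = ram[address]

# The leftmost (high-order) bit is the most significant bit;
# the rightmost (low-order) bit is the least significant bit.
most_significant_bit = (cell >> 7) & 1
least_significant_bit = cell & 1

print(f"cell {address}: {cell:08b}")
print("MSB:", most_significant_bit, "LSB:", least_significant_bit)
```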

Central processing unit

  • The central processing unit (CPU) includes an arithmetic/logic unit that performs operations on data, a control unit that coordinates the machine’s activities, and a register unit that temporarily stores results from the arithmetic/logic unit (and other data) in registers (Fig. 3).
  • The CPU is connected to the main memory (which is more permanent than the registers) via a collection of wires called a bus. To perform an operation on data from the main memory, the CPU uses the cell’s address to locate the desired data and copy it into a register. To write data back to the proper location within main memory, the CPU uses the same addressing system.

The stored program

  • Instructions for the CPU’s data manipulation can be stored in a computer’s main memory because programs and data are not fundamentally distinct entities.
  • The following steps summarize how stored programs operate.
    1. Retrieve a set of values from main memory and place each value within a register.
    2. Activate the circuitry that performs some operation upon the values (e.g., two values might be added together) and then store the result in another register.
    3. Transfer the result from its register to main memory for long-term storage. After this, stop the program.
  • CPUs also include cache memory in order to increase their speed. The cache holds a temporary copy of the portion of main memory that is undergoing processing at a given time. Using cache memory, the CPU can rapidly retrieve relevant data without needing to go all the way to main memory as often.

Machine language

  • Data transfer group: instructions that “transfer” data from a memory cell to a register (or perform some similar process) are more accurately described as “copying” the data. Requests to copy data from a memory cell to a register are called LOAD instructions. Requests to copy data from a register and write it to a memory cell are called STORE instructions. Requests that control interaction of the CPU and main memory with external devices like printers and keyboards are referred to as I/O instructions.
  • Arithmetic/logic group: the arithmetic/logic unit can carry out instructions that run data through basic arithmetic operations and Boolean logic gate operations (AND, NOT, OR, XOR, etc.). The arithmetic/logic unit also uses the SHIFT and ROTATE instructions. SHIFT moves bits to the left or right within a register. ROTATE is a version of SHIFT which moves the displaced bits to the slots at the other end of the register rather than allowing them to “fall off” as happens with SHIFT (a short sketch contrasting the two follows this list).
  • Control group: contains instructions that direct program execution and termination. JUMP (also called BRANCH) commands cause a program to change the next action that it performs. JUMP commands can be unconditional or conditional (when conditional, they work like “if” statements). The STOP command also falls into this category.
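The SHIFT/ROTATE distinction can be made concrete with a short Python sketch of an 8-bit register (the masking with 0xFF is needed only because Python integers are unbounded):

```python
def shift_left(value, n):
    """Logical left shift in an 8-bit register; bits shifted out are lost."""
    return (value << n) & 0xFF

def rotate_left(value, n):
    """Left rotation in an 8-bit register; displaced bits wrap to the low end."""
    n %= 8
    return ((value << n) | (value >> (8 - n))) & 0xFF

r = 0b10010110
print(f"SHIFT:  {shift_left(r, 2):08b}")   # 01011000 (two high bits fell off)
print(f"ROTATE: {rotate_left(r, 2):08b}")  # 01011010 (two high bits wrapped around)
```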

Machine cycle

  • The machine cycle involves two special purpose registers, the instruction register and the program counter.
  • The instruction register contains the instruction that is undergoing execution.
  • The program counter contains the address of the next instruction that will be executed and so keeps track of the machine’s place within the program.
  • The CPU performs the machine cycle in three steps (a toy simulation of this loop is sketched after this list).
    1. Fetch: the CPU retrieves an instruction from the main memory at the address specified by its program counter. The program counter then increments to specify the next instruction.
    2. Decode: the CPU breaks the instruction into appropriate components based on its operational code.
    3. Execute: the CPU activates the necessary circuitry to perform the command that was requested.
  • The computer’s clock is a circuit that generates oscillating pulses which control the machine cycle’s rate. A faster clock speed results in a faster machine cycle. Clock speed is measured in hertz (Hz). Typical laptop computers (as of 2018) run at clock speeds of several GHz.
  • To increase a computer’s performance, pipelining is often used. Pipelining involves allowing the steps of the machine cycle to overlap. Using pipelining, an instruction can be fetched while the previous operation is still underway, multiple instructions can be fetched simultaneously, and multiple operations can be executed simultaneously so long as they are independent of each other.
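To tie the stored-program concept to the machine cycle, here is a toy fetch-decode-execute loop in Python. The instruction format and opcodes are invented for illustration and do not correspond to any real machine language:

```python
# Toy stored-program machine: instructions and data share one memory.
# Each instruction is a tuple: (opcode, operand1, operand2).
memory = {
    0: ("LOAD", "R0", 100),          # copy memory cell 100 into register R0
    1: ("LOAD", "R1", 101),          # copy memory cell 101 into register R1
    2: ("ADD", "R2", ("R0", "R1")),  # R2 = R0 + R1
    3: ("STORE", "R2", 102),         # write R2 back to memory cell 102
    4: ("STOP", None, None),
    100: 7, 101: 35,                 # data
}
registers = {"R0": 0, "R1": 0, "R2": 0}
program_counter = 0

while True:
    # Fetch: read the instruction at the program counter, then increment it.
    opcode, a, b = memory[program_counter]
    program_counter += 1
    # Decode and execute: activate the behavior that the opcode requests.
    if opcode == "LOAD":
        registers[a] = memory[b]
    elif opcode == "ADD":
        registers[a] = registers[b[0]] + registers[b[1]]
    elif opcode == "STORE":
        memory[b] = registers[a]
    elif opcode == "STOP":
        break

print(memory[102])  # 42
```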

Multiprocessor machines

  • Some computers possess multiple CPUs that are linked to the same main memory. This is called a multiple-instruction stream multiple-data stream (MIMD) architecture. The CPUs operate independently while coordinating their efforts by writing messages to one another in their shared memory cells. In this way, one CPU can request that another CPU perform a specified part of a large processing task.
  • Some computers use multiple CPUs that are linked together so as to perform the same sequence of instructions simultaneously upon distinct datasets. This is called a single-instruction stream multiple-data stream (SIMD) architecture. SIMD machines are useful when the application requires the same task to be performed upon a large amount of data (a loose software analogy appears after this list).
  • Parallel processing can also be carried out using large computers that are composed of multiple smaller computers, each with its own CPU and main memory. In these cases, the smaller computers coordinate the partitioning of resources to handle a given task.
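As a loose software analogy for SIMD (an array library, not actual multiprocessor hardware), NumPy applies a single stream of operations across an entire dataset at once, which mirrors the single-instruction multiple-data idea:

```python
import numpy as np

data = np.arange(1_000_000, dtype=np.float64)

# SIMD-style: one operation broadcast over the whole dataset at once.
vectorized = data * 2.0 + 1.0

# Equivalent scalar loop: the same instruction applied one element at a time.
looped = np.empty_like(data)
for i in range(data.size):
    looped[i] = data[i] * 2.0 + 1.0

assert np.allclose(vectorized, looped)
```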

 

Reference and image source: Brookshear, J. G., Smith, D. T., & Brylow, D. (2012). Computer Science: An Overview. Addison-Wesley.

Topics of Interest



Architecture: antebellum architecture, architecture that incorporates extensive electronic components, architecture that merges seemingly disparate styles, atomic age architectural design, Baroque architecture, bionic architecture, Burj Khalifa, civic engineering, contemporary architecture, contemporary gothic architecture, Googie architecture, gothic architecture, green architecture, industrial architecture, lighting design in architecture, skyscraper design, supertall skyscrapers, urban planning

Art and writing: algorithmic art, antebellum fiction and poetry, aquapunk writing, art that incorporates industrial motifs, atomic age design, Baroque art, bioart, biopunk writing, body modification art, code poetry, contemporary American poetry, contemporary fairy tales, contemporary installation art, cyberart, cyberpoetry, cyberpunk writing, dieselpunk writing, futuristic fashion, Googie art, goth fashion, gothic art, industrial photography, Ken Rinaldo, literary science fiction, literary tales of vampirism, literary tales of witchcraft, Lovecraftian writing, magical realism, Middle Eastern literature, modern and contemporary American literature, multimedia art, nanopunk writing, Natasha Vita-More, Neil Harbisson, Neri Oxman, paleoart, performance art, posthuman art, posthuman fiction and poetry, radical fashion design, retro fashion, retrofuturism, retro science fiction, robots as art, sexuality in literature, Simon Stålenhag, speculative poetry, synthpop and electronic dance music, tattoo art, the femme fatale in literature, the gothic, the literature of monsters, the weird, utopian science fiction, writing inspired by atomic age design, writing that explores machine consciousness, writing that explores the interplay between danger and desire, writing that merges seemingly disparate genres, Yayoi Kusama

Chemistry: chemical kinetics, computational chemistry, computational protein folding, conductive polymers, flux balance analysis, industrial chemical manufacturing, inorganic chemistry, medicinal chemistry, molecular orbital theory, organic supramolecular chemistry, organic synthesis, organometallic chemistry, PEDOT, quantum chemistry, solid state chemistry, statistical thermodynamics

Earth and space sciences: astrobiology, biogeochemistry, black holes, caves, desert ecology, dinosaurs, evolution of early mammals, exotic atmospheric phenomena, galactic superclusters, large-scale structure of the universe, marine biology, Mars, Neptune, ocean chemistry, Paraceratherium, physical processes in stars, prehistoric sea creatures, telescopes, troglodytic organisms, tropical ecology, tundra ecology, underwater caves, urban ecology, velociraptors

Economics: American economic system, economics as ecology, economics of healthcare, entrepreneurship, Japanese economics, macroeconomics, mathematical finance, microeconomics, Scandinavian economics

Electrical engineering and computer science: algorithm design, applied probability, computational approaches to solving ordinary and partial differential equations, computational geometry, computer architecture, curve fitting and optimization, embedded systems, flexible electronics, graph algorithms, high-performance computing, k-means clustering algorithms, manifold learning, MATLAB, memristors, microelectronics, nanoelectronics, neuromorphic engineering, nonlinear dimensionality reduction, optoelectronics and photonics, Python, reinforcement learning, RF circuit design, semiconductor devices, supercomputer architecture, supervised learning, unsupervised learning, VLSI design

History: Abrahamic mythology and its ties to historical trends, F. Scott Fitzgerald, historical demonology, historical perspectives on romantic love, historical perspectives on the concept of monsters, historical perspectives on witchcraft, history of American literature, history of American poetry, history of architecture, history of biomedicine, history of chemistry, history of computers and electronics, history of feminism, history of Halloween, history of mathematics, history of Middle Eastern literature, history of modern and contemporary art, history of science fiction, history of skyscrapers, the Cold War

Humanities: academic perspectives on popular culture, academic perspectives on sanity and insanity, academic perspectives on synthpop and electronic dance music as well as their associated cultures, American politics, culture of the American South, demonology, digital humanities, future studies, gender studies, Japanese politics, perspectives on the relationship between genius and insanity as a psychosocial and cultural phenomenon, posthuman studies, romantic love as a psychosocial and cultural phenomenon, sexuality studies, structure and culture of criminal organizations, studies on contemporary Japanese culture, study of robots as a sociocultural phenomenon, study of vampires as a sociocultural phenomenon, study of witchcraft as a sociocultural phenomenon, queer studies, the femme fatale

Mathematics: abstract algebra, algebraic geometry, algebraic topology, calculus of variations, category theory, chaos theory, combinatorics, complex analysis, complex and hypercomplex geometry, convexity, differentiable and smooth manifolds, differential forms, differential geometry, dynamical systems, exotic spheres, Fourier analysis, fractal geometry, fractional calculus, functional analysis, geometric analysis, graph theory, group theory, harmonic analysis, high-dimensional sphere packing, hypercomplex numbers, independent component analysis, integral equations, knot theory, linear programming, measure theory, number theory, numerical analysis, ordinary and partial differential equations, principal component analysis, probability theory, random graphs, real analysis, set theory, stochastic geometry, surgery theory, tensors, tools from linear algebra, topological data analysis, topological manifolds, topology, tropical geometry, very large numbers

Molecular and synthetic biology: Adeno-associated viruses, bioactive natural products, bioinformatics, cardiac developmental biology, cardiac molecular biology, catalytic and regulatory RNAs, cell signaling, cell-based therapeutics, cytoskeleton, directed evolution, engineering logic gates using synthetic gene regulatory pathways, epigenetics and gene regulation, gastrointestinal physiology, gene therapy, glycobiology and glycomics, HIV biology, inorganic biochemistry, limb development, lipidomics, membrane biology, metabolic engineering, metabolomics, molecular biology of aquatic organisms, molecular biology of fungi, molecular biology of muscles, molecular biology of vesicles and vesicular trafficking, molecular biology of yeast, molecular biology techniques, molecular endocrinology, molecular entomology, molecular genetics of bacteria, molecular immunology, molecular mechanobiology, nanopore sequencing, next-generation sequencing, organ-on-a-chip technologies, protein engineering, protein folding, proteomics, regenerative biology, specialized PCR variants, specialized forms of gel electrophoresis, structural biology, subnuclear organelles, the Golgi apparatus, the microbiome, tissue engineering, transcriptomics, virology

Nanotechnology: bioconjugation, bionanotechnology, carbon nanotubes, crystalline lattices for nanotechnology, DNA origami and other DNA nanotechnology, gold nanoparticles for biotechnology, graphene, mass production of nanotechnology, nanoelectronics, nanomechanics, nanorobotics, nanoscale membranes, nanotechnology-based drug delivery systems, optical interfacing with nanoscale systems, polymer engineering, quantum dots, quantum tunneling in nanotechnology, rotaxanes and catenanes, self-assembly, self-replicating machines, supramolecular chemistry, upconversion nanoparticles, using proteins as components within nanotechnological systems

Neuroscience: affective neuroscience, amygdala, application of X-ray microscopy to neuroscience, applications of higher mathematics to neuroscience (i.e. algebraic geometry, algebraic topology, category theory, etc.), applications of nanotechnology to neuroscience, applications of signal processing to neuroscience, auditory cortex, bioinspired artificial intelligence and robotics, biophysical theory of dendritic voltage propagation, brain-machine interfaces, cephalopod neurobiology, cerebellar circuitry, cerebral organoids, computations in the primary motor cortex, connectomics, contrast agent design, developmental neurobiology, engineering new types of electrodes and optrodes, enteric nervous system, expansion microscopy, graph theoretic models of neuronal connectivity, Hodgkin-Huxley models, honeybee cognition, image processing for 3D reconstruction of neuronal microanatomy, information processing in the peripheral nervous system, injectable electronics, insect sensory neurobiology, interactions of materials with brain tissue, light-sheet microscopy for neuroscience, mathematical models of synaptic potentiation, mechanisms of information coding in neural systems, microinsect neurophysiology (i.e. parasitoid wasps), molecular biology of the synapse, morphological diversity of neurons, MRI and fMRI, multielectrode arrays, neural circuits in the spinal cord, neural dust, neural mass models, neuroacarology, neurobiology of exotic arthropods, neuroendocrinology, neuroimmunology, neuroinformatics, neuromodulators, neuromuscular interactions, neuromyrmecology, NEURON, neurophysics, neuroscience of glial cells, neuroscience of language, neuroscience of reptiles, neurotechnology, noncoding RNAs in the brain, nucleus accumbens, optogenetics, phenomenological models of single neurons, retinal computations, reward system, RNA-seq for characterizing neurons, simulating large populations of multicompartmental Hodgkin-Huxley-type neurons, spider neurobiology, synthetic neurobiology, thalamus, tool development for connectomics, two-photon microscopy for neuroscience

Philosophy: affective philosophy, bioethics, epistemology, ethics of animal suffering, ethics of cerebral organoids, ethics surrounding machine consciousness, existentialism, extropianism, feminist philosophy, monism, ontology, panpsychism, philosophical definitions of the body, philosophy of art, philosophy of gender and sexuality, philosophy of mathematics, philosophy of science, philosophy of technology, philosophy regarding the fundamentals of good and evil, posthumanism, rational romanticism, romanticism, transhumanism, utilitarianism and deontology

Physics: computational physics, differential geometry of spacetime, diffusion, electrodynamics, general relativity, high energy physics, long-term fate of the universe, M-theory, metallurgy, optics, physical theory of protein folding, polymer physics, quantum computing, quantum field theory, quantum mechanics, radio frequency physics, solid state physics, solid state quantum mechanics, solid state thermodynamics, string theory, theory of classical mechanics, thermodynamics, topological matter, X-ray physics

 

Interesting Lab Websites


  1. Alexander-Katz group: lipid bilayer physics, biopolymer physics, self-assembly http://web.mit.edu/soft-materials/
  2. Anikeeva lab: bioelectronics, flexible neural probes, optoelectronics, magnetic devices http://www.rle.mit.edu/bioelectronics/
  3. Baker lab: computational protein engineering, de novo protein design, the protein folding problem, homology modeling, ab initio modeling, crowdsourcing methods for protein folding https://www.bakerlab.org/index.php/2016/09/16/the-coming-of-age-of-de-novo-protein-design/
  4. Barron group: using the bee brain as a basis for understanding cognition, biomimetic artificial intelligence, insect neurobiology http://andrewbarron.org/
  5. Berger group: hippocampal prosthesis, brain-brain interfacing, signal processing, implantable neuroelectronics https://cne.usc.edu/
  6. Boyden lab: synthetic neurobiology, expansion microscopy, optogenetic tools, connectomics, directed evolution, protein engineering, optical tools for neuroscience http://syntheticneurobiology.org/
  7. Bruns lab: nanomechanical devices, supramolecular chemistry, rotaxanes https://www.emergentnanomaterials.com/
  8. Chittka lab: honeybee neurobiology and ecology, sensory neurobiology of honeybees, computational neuroscience http://chittkalab.sbcs.qmul.ac.uk/index.html
  9. Chung lab: CLARITY, related tools for connectomics http://www.chunglab.org/
  10. Church lab: synthetic biology, DNA nanotechnology, tools for systems biology, evolutionary biology, genome engineering, CRISPR, aging research, tissue engineering, nanopore sequencing http://arep.med.harvard.edu/
  11. Cohen lab: computational approaches to neural oscillations, experimental approaches to neural oscillations http://mikexcohen.com/#books
  12. Douglas lab: DNA nanotechnology, protein engineering, nanorobotics https://bionano.ucsf.edu/
  13. Doyle group: microparticles for biomedicine, microfluidics, DNA polymer physics https://doylegroup.mit.edu/
  14. Häusser group: neural computations in the cerebellum and neocortex, recording neural activity with Neuropixels, focused ion beam scanning electron microscopy for connectomics, simultaneous two-photon imaging and optogenetic manipulation, patch-clamp tools http://www.dendrites.org/about/
  15. Holten group: bio-inspired interfaces, materials science, polymers https://sites.google.com/site/holtengroup/
  16. Horiuchi group: computational sensorimotor neuroscience, neuromorphic VLSI design, neural computation in bats, mobile robotics inspired by neural computations in bats https://isr.umd.edu/Labs/CSSL/horiuchilab/horiuchilab.html
  17. Jeong lab: flexible electronics, brain-machine interfaces, biophotonics, wearable electronic “tattoos” https://www.jeongresearch.org/
  18. Ji lab: optical microscopy tools for neuroscience, neural circuits, computation in visual pathways https://www.jilab.net/research/
  19. Johnson group: branched polymer nanomaterials, hydrogel networks, semiconducting organometallic polymers http://web.mit.edu/johnsongroup/#
  20. Lieber lab: injectable electronics, biomaterials, brain-machine interfaces, flexible electronics, immunological responses to implanted electronics http://cml.harvard.edu/
  21. Lytton group: computational neuroscience, multiscale modeling of neurobiological systems, software development for biophysical modeling, dendritic processing models, network models, molecular models http://neurosimlab.com/content/research
  22. Maharbiz lab: neural dust, implantable microelectronics, brain-computer interfaces, bioelectronics, electrical engineering https://maharbizgroup.wordpress.com/
  23. Olsen lab: polymers, protein engineering, network chemistry, nanotechnology http://olsenlab.mit.edu/
  24. Pessoa lab: emotion and cognition, computational neuroscience, affective brain networks http://www.lce.umd.edu/
  25. Ramirez group: neurobiology of learning and memory, engineering memories using optogenetics and other techniques in order to treat psychiatric disorders http://theramirezgroup.org/research/
  26. Schiller lab: cortical computation, single neuron computation, plasticity mechanisms in cortex, sensorimotor learning mechanisms https://schillerj.net.technion.ac.il/projects/
  27. Sestan lab: spatial transcriptomics in the brain, computational neuroscience, systems neuroscience, RNA sequencing https://medicine.yale.edu/lab/sestan/resources/
  28. Wang lab: nanobioelectronics, nanorobotics, nanobiosensors, flexible materials http://joewang.ucsd.edu/

Notes on neural mass models



PDF version: Notes on neural mass models – Logan Thrasher Collins      

Homogeneous and heterogeneous populations of neurons

  • The simplest type of neural mass model assumes a homogeneous population of neurons. This means that all neurons are coupled to each other and to themselves with an equal interaction strength w_ij = w_0. In graph theoretic terms, this is a complete graph with self-edges at every vertex. Furthermore, all neurons receive the same amount of externally applied current I_ext(t). As a consequence of these approximations, this type of model can only be used for large populations of neurons.
  • Neural coupling strengths that are less than zero are inhibitory. Neural coupling strengths that are greater than zero are excitatory.
  • Population activity is defined by the equation below. Note that this equation is not specific to homogeneous populations; it can be used for many other types of models as well. N refers to the total number of neurons in the population, while n_spikes counts the number of spikes between time t and a subsequent time t + Δt. The δ is the Dirac delta function and t_j^f is the time at which neuron j fires its fth spike (a numerical sketch follows the equation).

$$A(t) = \lim_{\Delta t \to 0} \frac{1}{\Delta t}\, \frac{n_{\text{spikes}}(t;\, t + \Delta t)}{N} = \frac{1}{N} \sum_{j=1}^{N} \sum_f \delta\!\left(t - t_j^f\right)$$
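Here is a minimal NumPy sketch of this definition, using a finite bin width Δt in place of the limit; the spike times are randomly generated stand-in data rather than recordings:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                       # number of neurons in the population
T, dt = 1.0, 0.005             # observation window (s) and bin width Δt (s)

# Stand-in data: each neuron fires at random times (roughly 8 Hz each).
spike_times = [rng.uniform(0, T, rng.poisson(8)) for _ in range(N)]

# A(t) ≈ n_spikes(t; t + Δt) / (N Δt), evaluated on a grid of bins.
bins = np.arange(0, T + dt, dt)
counts, _ = np.histogram(np.concatenate(spike_times), bins=bins)
A = counts / (N * dt)          # population activity in Hz

print(A.mean())                # ≈ 8 Hz, the mean single-neuron firing rate
```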

  • The electrophysiological activities of integrate-and-fire neurons are defined by the following differential equation and reset rule, where τ_m is the membrane time constant (which equals the membrane resistance R times the membrane capacitance C), I(t) is the input current, and u is the membrane voltage. If the value of u passes a threshold ϴ, then a spike occurs and u is reset to the resting potential u_rest (a simple Euler integration is sketched after the equation).

$$\tau_m \frac{du}{dt} = -\left(u - u_{\text{rest}}\right) + R\, I(t), \qquad \text{if } u(t) \geq \Theta \text{ then } u \to u_{\text{rest}}$$
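A forward-Euler integration of this equation is sketched below; the parameter values are arbitrary illustrative choices rather than values from the source:

```python
tau_m, R = 0.010, 1e7           # membrane time constant (s) and resistance (ohm)
u_rest, theta = -0.070, -0.054  # resting potential and threshold (V)
dt, T = 1e-4, 0.5               # time step and duration (s)

u = u_rest
spike_times = []
for step in range(int(T / dt)):
    I = 2e-9                    # constant input current (A)
    # tau_m * du/dt = -(u - u_rest) + R * I(t)
    u += dt / tau_m * (-(u - u_rest) + R * I)
    if u >= theta:              # threshold crossing: record a spike, then reset
        spike_times.append(step * dt)
        u = u_rest

print(len(spike_times), "spikes in", T, "s")
```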

  • One of the tools necessary for describing the activity of a homogeneous population of integrate-and-fire neurons is a function α(t − t_j^f), which represents the postsynaptic current generated by an input spike from neuron j at time t_j^f. Depending on the shape of the curve used to model the postsynaptic current, α(t − t_j^f) might take on different forms.
  • Since all neurons are coupled to each other (and to themselves) in a homogeneous population, the total current into any given neuron is the externally applied current plus the sum of the postsynaptic currents from all input spikes, each multiplied by its interaction weight w_ij.

$$I_i(t) = \sum_j w_{ij} \sum_f \alpha\!\left(t - t_j^f\right) + I_{\text{ext}}(t)$$

  • For homogeneous populations with all-to-all coupling and a constant interaction strength w_0, the total current is the same in every neuron. This current is given by the following equation (since we can assume a continuum for a large population of neurons). The integral is multiplied by w_0 N because every one of the N neurons is connected to the given neuron. Here, s represents the elapsed time since a spike.

$$I(t) = w_0 N \int_0^{\infty} \alpha(s)\, A(t - s)\, ds + I_{\text{ext}}(t)$$

  • Consider a population in which each neuron has slightly different parameters from the others, such that the firing rates r_i(I(t)) vary over the population despite each neuron receiving the same input current. If the population is large, then the function describing the variation in firing rate can be linearized around the average firing rate, neglecting the higher-order terms of the Taylor series. This simplification (the linearized model) can still be useful for some applications.

$$A(t) = \frac{1}{N} \sum_{i=1}^{N} r_i\!\left(I(t)\right) \approx \bar{r}\!\left(I(t)\right)$$

  • The above expression can also be thought of as indicating that the mean firing rate of the population is equal to the firing rate of a “typical” neuron (with “typical” parameters) in the population.
  • In cases that involve more dramatic variations within populations of neurons, the averaging technique described above is insufficient. For instance, consider a population in which half the neurons are described by a set of parameters p1 and the other half by a set of significantly different parameters p2. This population should be split and regarded as two homogeneous populations.
  • Indeed, any population composed of subsets which differ significantly from each other should be decomposed into its homogeneous subsets. The same applies to populations composed of neurons with identical parameters but with subsets that receive significantly different input currents.

Connectivity schemes and scaling

  • Using these techniques, populations of different sizes can give similar results if a scaling law is applied to the connection weights. For a homogeneous population with all-to-all connectivity, the appropriate scaling law is as follows, where J_0 is a constant setting the total coupling strength delivered to each neuron and N is the number of neurons in the population.

$$w_0 = \frac{J_0}{N}$$

  • Increasing the size of a population while keeping its connectivity the same allows for noise reduction. This is especially useful since some populations are quite small. For instance, a single layer within a cortical column might have only a few hundred neurons.
  • Another all-to-all coupling model involves using a Gaussian distribution of weights with the following mean and standard deviation (σ_0 is the standard deviation of the weights prior to scaling).

$$\mu = \frac{J_0}{N}, \qquad \sigma = \frac{\sigma_0}{\sqrt{N}}$$

  • Populations can also be modeled by setting a fixed coupling probability p (among the N^2 possible connections). In this type of model, the mean number of connections to a neuron j is given by pN and the variance is p(1 − p)N. Alternatively, each neuron j can send outputs to exactly pN randomly chosen partners. To scale a population with a fixed coupling probability, the equation below is used so that the mean total input to each neuron does not change as the population size changes (a numerical check follows the equation).

$$w_0 = \frac{J_0}{pN}$$
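The in-degree statistics above, and the effect of this scaling law, can be checked numerically; the values of N, p, and J_0 in this sketch are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 2000, 0.1

# Each of the N^2 possible connections exists independently with probability p.
adjacency = rng.random((N, N)) < p

in_degrees = adjacency.sum(axis=1)  # number of inputs to each neuron
print("mean in-degree:", in_degrees.mean(), "| expected pN =", p * N)
print("variance:", in_degrees.var(), "| expected p(1 - p)N =", p * (1 - p) * N)

# Scaling the weights as w0 = J0 / (pN) keeps the mean total input fixed.
J0 = 1.0
w0 = J0 / (p * N)
print("mean total input weight:", (adjacency * w0).sum(axis=1).mean())  # ≈ J0
```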

  • Some simulations can assume a balanced population of excitatory and inhibitory neurons. In such cases, the mean input current is zero, so scaling the connection weights does not influence the mean. Instead, the weights should be scaled with respect to how they affect fluctuations about zero. This can be achieved using the following scaling equation.

$$w_0 = \frac{J_0}{\sqrt{pN}}$$

Interacting populations

  • In the previous sections, balanced populations of excitatory and inhibitory neurons were used. Now, consider homogeneous populations, each consisting of either excitatory or inhibitory neurons but not both (Fig. 1).
  • These populations can be visualized as spatially separated from each other, but this is not necessary for the model to work (and it is not biologically realistic). The populations could just as easily be spatially mixed.
  • The activity of neurons in homogeneous population n is given by the equation below. The parameter Γ_n represents the set of neurons belonging to population n.

$$A_n(t) = \frac{1}{N_n} \sum_{j \in \Gamma_n} \sum_f \delta\!\left(t - t_j^f\right)$$

  • With all-to-all coupling, each neuron i within pool n is assumed to receive inputs from every neuron j within pool m. The connection strength is w_ij = J_nm/N_m, where J_nm is the strength of an individual coupling from a neuron in pool m to a neuron in pool n and N_m is the number of neurons in pool m. As such, the input current to a neuron i will come from all the spikes in the network. Once again, α represents the time course of the postsynaptic current generated by an input spike.

$$I_i(t) = \sum_m \frac{J_{nm}}{N_m} \sum_{j \in \Gamma_m} \sum_f \alpha\!\left(t - t_j^f\right)$$

  • The input current can also be formulated by the equation below. Since the model provides identical input current to all neurons within a pool, the index i can be dropped (a discrete-time sketch follows the equation). Once again, s represents the elapsed time since a spike.

$$I_n(t) = \sum_m J_{nm} \int_0^{\infty} \alpha(s)\, A_m(t - s)\, ds$$
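This formula can be evaluated by discrete-time convolution; in the sketch below, the coupling matrix, the activity traces, and the α kernel are all invented for illustration:

```python
import numpy as np

dt = 0.001                       # time step (s)
J = np.array([[1.0, -2.0],       # J[n, m]: coupling from pool m to pool n
              [0.5, -1.0]])      # (negative entries are inhibitory)

# Invented activity histories A_m(t) for two pools, 200 ms long.
t = np.arange(0, 0.2, dt)
A = np.vstack([10 + 5 * np.sin(2 * np.pi * 10 * t),
               8 + np.cos(2 * np.pi * 10 * t)])

# Exponentially decaying postsynaptic current kernel alpha(s).
s = np.arange(0, 0.05, dt)
alpha = np.exp(-s / 0.005)

# I_n(t) = sum_m J[n, m] * integral of alpha(s) * A_m(t - s) ds
I = np.array([
    sum(J[n, m] * np.convolve(A[m], alpha, mode="full")[:t.size] * dt
        for m in range(2))
    for n in range(2)
])
print(I.shape)                   # (2, 200): input current to each pool over time
```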

Distance-dependent connectivity

  • To better model neural populations, distance can provide an approximate measure for coupling probability (with more distant neurons having a lower probability of coupling). It should be noted that this is still a very rough model (Fig. 2).
  • In order to create a model with distance-dependent connectivity, each neuron i must be assigned a location x(i) on a two-dimensional cortical sheet.
  • For this type of model, all connections are assigned the same weight and the connection probability depends on distance (see part A of the diagram). P is a function which maps any vector to a real number on the interval [0,1].

$$\operatorname{Prob}\left(j \to i\right) = P\!\left(x(i) - x(j)\right)$$

  • Alternatively, all-to-all coupling can be assumed with a strength w_ij that decreases with distance (see part B of the diagram). This is modeled by the following equation, where g is a function that maps any vector to a real number (for example, a decaying exponential of the Euclidean distance between the two locations). A sketch of this scheme appears after the equation.

$$w_{ij} = g\!\left(x(i) - x(j)\right)$$
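Here is a sketch of this weighting scheme on a two-dimensional cortical sheet; the choice of a decaying exponential for g and the decay length are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500
positions = rng.uniform(0, 1, size=(N, 2))   # x(i): neuron locations on a 2D sheet

# g maps the displacement vector to a weight that decays with distance.
decay_length = 0.1
displacements = positions[:, None, :] - positions[None, :, :]
distances = np.linalg.norm(displacements, axis=-1)
weights = np.exp(-distances / decay_length)  # w_ij = g(x(i) - x(j))

np.fill_diagonal(weights, 0.0)               # optionally exclude self-coupling
print(weights.shape)                         # (500, 500) coupling matrix
```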

Spatial continuum models

  • Many neural populations in the brain exhibit properties which vary continuously across space (e.g., tonotopy and retinotopy). Of course, this kind of variation is not actually continuous at the level of individual neurons, but it is effectively continuous from the perspective of population modeling. Sets of homogeneous populations cannot account for such continuous variation, so spatial continuum models must be used when considering this kind of functional organization (Fig. 3).
  • Consider a continuum of neurons along a one-dimensional axis and assume all-to-all coupling with connection strength dependent upon distance. This model uses the equation w_ij = g(|x(i) − x(j)|) as described in the previous section. Then discretize space into segments of length d. The number of neurons in a segment m is given by the following equation, where ρ represents the density of neurons. Neurons within this interval belong to the set Γ_m.

$$N_m = \rho\, d$$

  • For continuum models, the population activity of population m is described as a function of time and of the spatial position of the neurons belonging to population m. The latter is given by md since the distance along the axis is equal to the segment length times the index m.
  • The coupling strength between a neuron x(i) at location nd and a neuron x(j) at location md is a function w(nd,md) that defines a weighting measure depending on the distance between the two locations nd and md.
  • As such, the input current to a neuron at location nd is computed using the following equation (top). Inside the summation, the coupling from the population at location md is multiplied by that population’s activity. The input currents from all populations are then summed. For a large number of populations, the sum can be replaced by an integral, giving the second equation. For convenience, md has been replaced with y in these equations (a discretized numerical sketch follows).

$$I(nd,\, t) = \sum_m \rho\, d\; w(nd,\, md) \int_0^{\infty} \alpha(s)\, A(md,\, t - s)\, ds$$

$$I(x,\, t) = \rho \int w(x,\, y) \int_0^{\infty} \alpha(s)\, A(y,\, t - s)\, ds\, dy$$
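A discretized sketch of the top equation, assuming for simplicity a steady activity profile A(y) so that the temporal integral of α(s)A reduces to A times the integral of α; the density, coupling function, and activity profile are all invented for illustration:

```python
import numpy as np

rho, d = 100.0, 0.01                  # neuron density and segment length
y = np.arange(0.0, 1.0, d)            # segment centers md along the axis

# Assumed steady activity profile A(y) and a stand-in value for the
# time integral of the postsynaptic current kernel alpha(s).
A = 5.0 + 4.0 * np.sin(2 * np.pi * y)
alpha_integral = 1.0

def w(x, y_points):
    """Coupling strength w(x, y), chosen to decay exponentially with distance."""
    return np.exp(-np.abs(x - y_points) / 0.05)

# I(x) = rho * sum_m d * w(x, y_m) * alpha_integral * A(y_m)
x = 0.5
I_x = rho * np.sum(d * w(x, y) * alpha_integral * A)
print(I_x)
```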

 

Reference: Gerstner, W., Kistler, W. M., Naud, R., & Paninski, L. (2014). Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition. New York, NY, USA: Cambridge University Press.