[FRIAM] Run with a single bit?

Tom Johnson tom at jtjohnson.com
Sun Jul 2 16:36:14 EDT 2017


Many thanks for the insightful explanation, Steve.
T

===================================
Tom Johnson - Inst. for Analytic Journalism
Santa Fe, NM
tom at jtjohnson.com               505-473-9646
===================================

On Jul 2, 2017 12:33 PM, "Stephen Guerin" <stephen.guerin at simtable.com>
wrote:

> Looking into Ofer Dekel's work, the journalist messed up this sentence:
>
> compress neural networks, the synapses of Machine Learning, down from 32
> bits to, sometimes, a single bit
>
>
>
> The team is not compressing neural networks to one bit. They are
> compressing the weights used in a neural network from 32 bits to one bit.
>
> Weights (sometimes also called biases) are proxies for synapses in
> biological neural networks. The weights in a neural network are the
> activation strengths (or inhibition strengths, if negative) on each
> incoming link to a node; each weight is multiplied by the outgoing
> signal strength of the uplink neighbor. As a number, a weight can be
> expressed in 32 bits or in as little as 1 bit, though a 1-bit weight
> would not allow for inhibition. The weights are what get tuned during
> machine learning. One can also explore the topology of the neural
> network (which nodes are connected to which) during learning, which is
> the basis for the new craze around Deep Learning. This technique has
> been around since the '90s but has only now realized its potential with
> the availability of data. E.g., here's a paper of mine
> <http://www.academia.edu/download/5213114/objectgarden.pdf> from '99
> implementing some of the research from UT Austin at the time.
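>
> As a rough sketch, one common trick in the literature (not necessarily
> what Dekel's team does) is to keep only the sign of each 32-bit weight,
> +1 or -1, plus a single shared scale per layer. In Python it looks
> something like this:
>
> import numpy as np
>
> def binarize(weights):
>     """Quantize a float32 weight matrix to {-1, +1} plus one scale."""
>     scale = np.abs(weights).mean()            # one shared scale per layer
>     signs = np.sign(weights).astype(np.int8)  # 1 bit of information per weight
>     return signs, scale
>
> def forward(x, signs, scale, bias=0.0):
>     """Approximate the original layer output x @ weights + bias."""
>     return x @ (signs * scale) + bias
>
> rng = np.random.default_rng(0)
> w = rng.normal(size=(4, 3)).astype(np.float32)  # the usual 32-bit weights
> signs, scale = binarize(w)
> x = rng.normal(size=(1, 4)).astype(np.float32)
> print(forward(x, signs, scale))                 # roughly tracks x @ w
>
> (A real implementation would pack the signs 8 to a byte; int8 here is
> just for readability.)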
>
> I think it would have been clearer for the journalist to write:
>
> Ofer Dekel's team is researching methods to reduce the memory and
> processing requirements of Neural Networks so they can run on smaller
> devices at the edge of the network, closer to the sensors. One method
> is to reduce the number of bits needed to describe the weights between
> nodes in a neural network, down from 32 bits to as little as one bit.
>
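> For a sense of scale: a model with, say, 10 million weights needs about
> 40 MB at 32 bits per weight but only about 1.25 MB at 1 bit per weight,
> which is the difference between not fitting and fitting comfortably on
> a small edge device like the Pi.
>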
> -S
> _______________________________________________________________________
> Stephen.Guerin at Simtable.com
> CEO, Simtable  http://www.simtable.com
> 1600 Lena St #D1, Santa Fe, NM 87505
> office: (505) 995-0206   mobile: (505) 577-5828
> twitter: @simtable
>
> On Sat, Jul 1, 2017 at 11:42 PM, Tom Johnson <tom at jtjohnson.com> wrote:
>
>> Friam Friends:
>>
>> A recent article
>> <http://mashable.com/2017/06/29/microsoft-puts-ai-on-a-raspberry-pi/#HKAb_h1pvaqc>
>> passed along by George Duncan says:
>>
>> "Now, Varma's team in India and Microsoft researchers in Redmond,
>> Washington, (the entire project is led by lead researcher Ofer Dekel) have
>> figured out how to *compress neural networks, the synapses of Machine
>> Learning, down from 32 bits to, sometimes, a single bit* and run them on
>> a $10 Raspberry Pi, a low-powered, credit-card-sized computer with a
>> handful of ports and no
>> screen."
>>
>> How, or what, can you do with a "single bit"?
>>
>> TJ
>>
>> ============================================
>> Tom Johnson
>> Institute for Analytic Journalism   --     Santa Fe, NM USA
>> 505.577.6482 (c)
>> 505.473.9646 (h)
>> Society of Professional Journalists <http://www.spj.org>
>> *Check out It's The People's Data
>> <https://www.facebook.com/pages/Its-The-Peoples-Data/1599854626919671>*
>> http://www.jtjohnson.com                   tom at jtjohnson.com
>> ============================================
>>
>> ============================================================
>> FRIAM Applied Complexity Group listserv
>> Meets Fridays 9a-11:30 at cafe at St. John's College
>> to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
>> FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove
>>
>
>
> ============================================================
> FRIAM Applied Complexity Group listserv
> Meets Fridays 9a-11:30 at cafe at St. John's College
> to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com
> FRIAM-COMIC http://friam-comic.blogspot.com/ by Dr. Strangelove
>