Juno is a top notch neutrino observatory (LC Thread)

06-09-2017 , 06:24 AM
Quote:
Originally Posted by Noodle Wazlib
Tonight I learned that raccoons have very large families and sometimes they go scavenging in packs.
I learned that sometimes a female skunk will have babies under your shed, and then die. Fun times.
06-09-2017 , 09:09 AM
Quote:
Originally Posted by Noodle Wazlib
Tonight I learned that raccoons have very large families and sometimes they go scavenging in packs.
Snake hunting packs... racoon hunting packs... waaf
06-09-2017 , 09:47 AM
For a while (long time ago) we had a family of raccoons who would come by every night to eat the leftover cat food our outdoor cats had. You could just hand them a piece of cat food (though most of the time we chased them off).
06-09-2017 , 01:04 PM
06-09-2017 , 03:11 PM
A take that Gawker was ultimately the source for taking Cosby down. Mostly I like it because of the idea of the accumulated circle of people willing to say things.

Quote:
In reality it was Scocca, and Gawker, who made the story safe for Buress to talk about. Not to knock Buress in any way: It was his huge, diverse audience and the incisiveness and wit of his Cosby bit that really blew the story open, as if Scocca had pitched the ball and Buress had hit a home run. (We know that Buress read Scocca’s piece because he tweeted about it within a couple of days of publication, in a since-deleted tweet.) The defendants in the Cosby trial, in short, owe everything to one journalist’s memory at the late, lamented Gawker, the unpopular truth-telling publication ignominiously killed off by billionaire sneak Ayn Rand fan and loser Peter Thiel, who is now to be found staffing sensitive jobs in the White House with his fellow loser Silicon Valley libertarian loser freak friends.

I’m not writing this because I think Scocca should be given the credit for setting the events in motion that brought Cosby into this courtroom, though I believe he should. I’m writing this because I strongly believe that if we want media that can take on powerful interests effectively, the path by which this particular story was buried, and then came back to light, is important. By settling with Constand and paying for a ton of favorable media, by paying tons of PR people and lawyers and god knows who else, Cosby had succeeded in creating public amnesia to the degree that he continued to collect awards and honors; he won the Mark Twain Prize in 2009 and the Marian Anderson Award in 2010 (honoring “critically acclaimed artists who have impacted society in a positive way”); he was in talks for a new sitcom on NBC, since dropped.

Because of Scocca’s piece, every writer who came after him to discuss Cosby was shielded from the criticism that comes with insulting a popular public figure. The information was already out there to be commented on. When I suggested that it had been he who “made it safe,” Scocca said, “There are just sort of ever-expanding circles of safety, right? It was safe to write the thing on Gawker because Philadelphia Magazine had done a good piece before, and Barbara Bowman had come forward before.

“Safety comes from the accumulated willingness of people to say things,” he continued. “Everything that happened, happened because somebody had taken a step forward, right? So Katie Baker went to [Cosby’s accusers] and got their stories after the Gawker piece, and that created a body of material that was there for Hannibal Buress. And once Hannibal Buress had referenced it, then people felt that they could talk more about the story. It’s any number of things that come together.”
http://www.deathandtaxesmag.com/3337...ca-bill-cosby/
06-09-2017 , 08:49 PM
I blame Nader ofc. He wrote "Only the Super Rich Can Save Us!" imagining a world where Cosby and Donahue, etc... pooled their resources and brought about real change. It's like he made a hitlist for the Massive Right-Wing Conspiracy!
06-09-2017 , 11:15 PM
Quote:
Originally Posted by Riverman
I don't know, Travis sounds like he would be a pretty cool guy to work for.
06-09-2017 , 11:27 PM
Von Neumann architecture processors, a design that's been around since 1945 and is in basically every device with a CPU today I think, are getting replaced in big data projects by a new HIVE processor. Intel and Qualcomm are working with DARPA to create the new architecture

https://hardware.slashdot.org/story/...e-of-processor
06-09-2017 , 11:47 PM


Might be a slow pony, but I liked this.
06-10-2017 , 12:04 AM
Rack another one up in the Millennials column



https://twitter.com/VancouverSun/sta...75174765854720
06-10-2017 , 12:10 AM
Quote:
Originally Posted by Riverman
It's from 2013, who cares?
06-10-2017 , 07:11 AM
Quote:
Originally Posted by Noodle Wazlib
was reading something about the cost of climbing everest today - if you only spend $20k/person, you're probably going to die.
Got a link for this mate?
06-10-2017 , 08:03 AM
Quote:
Originally Posted by Noodle Wazlib
von-Neumann architecture processors, the design of which has been around since 1945 and is basically in every device with a CPU today I think, are getting replaced in big data projects by a new HIVE processor. Intel and Qualcomm are working with DARPA to create the new architecture

https://hardware.slashdot.org/story/...e-of-processor
Good find.

Pretty interesting, I am still trying to imagine what this architecture will look like and the exact problem it is trying to solve.

It feels like an expansion of the principles in the Harvard Architecture that are used in DSPs but there still isn't enough detail in that article to say.
06-10-2017 , 12:28 PM
Quote:
Originally Posted by Csaba
Got a link for this mate?
http://www.alanarnette.com/blog/2016...mount-everest/
06-10-2017 , 12:54 PM
Quote:
Originally Posted by Huehuecoyotl
Rack another one up in the Millennials column



https://twitter.com/VancouverSun/sta...75174765854720
Because leaking gov't information def isn't something that every generation in recorded history has engaged in. JFC, can we all just stop ****ting on Millennials for like maybe one week?
06-10-2017 , 03:54 PM
06-10-2017 , 03:59 PM
Apparently cartoon trump wears some sort of collared muscle shirt with no buttons under his suit.
06-10-2017 , 05:41 PM
Quote:
Originally Posted by Noodle Wazlib
von-Neumann architecture processors, the design of which has been around since 1945 and is basically in every device with a CPU today I think, are getting replaced in big data projects by a new HIVE processor. Intel and Qualcomm are working with DARPA to create the new architecture

https://hardware.slashdot.org/story/...e-of-processor
I understood literally nothing from that article. Is there a version for people who are dumb?
06-10-2017 , 05:42 PM
Quote:
Originally Posted by Trolly McTrollson
He nailed the tie length for the red tie, but blew it for the other one.
06-10-2017 , 11:26 PM
Looks like Trump went super saiyan.
06-10-2017 , 11:38 PM
Quote:
Originally Posted by bobman0330
I understood literally nothing from that article. Is there a version for people who are dumb?
Lack of familiarity with CPU architectures does not make one dumb!

Sadly, any explanation would likely be incredibly boring to all but the nerdiest of computer enthusiasts.
06-10-2017 , 11:45 PM
also, super sad to hear about Adam West's passing.
06-11-2017 , 12:05 AM
1966 Batman movie >>>>>>>> every other Batman thing.
06-11-2017 , 12:39 AM
Quote:
Originally Posted by bobman0330
I understood literally nothing from that article. Is there a version for people who are dumb?
I will attempt to translate.

So for starters, a "graph" has a particular meaning in computer science:

Quote:
A graph data structure consists of a finite (and possibly mutable) set of vertices or nodes or points, together with a set of unordered pairs of these vertices for an undirected graph or a set of ordered pairs for a directed graph. These pairs are known as edges, arcs, or lines for an undirected graph and as arrows, directed edges, directed arcs, or directed lines for a directed graph.
So basically it's a whole set of entities and set of relations or links between them. Huge amounts of data structured like this are increasingly common in Big Data - think about a social network, for instance, people linked together in very complex ways.
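To make the "entities and links" idea concrete, here's a minimal sketch of a graph as a Python adjacency list. The names and the `neighbors` helper are just made up for illustration; a real social graph would have millions of nodes.

```python
# A directed graph stored as an adjacency list: each node maps to the
# list of nodes it links to (think "alice follows bob and carol").
social_graph = {
    "alice": ["bob", "carol"],
    "bob": ["carol"],
    "carol": ["alice"],
    "dave": [],  # a node with no outgoing edges
}

def neighbors(graph, node):
    """Return the nodes directly linked from `node`."""
    return graph.get(node, [])
```

The pairs ("alice", "bob"), ("alice", "carol"), etc. are the edges; the dict keys are the vertices.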

Most computer processors right now use what is called Von Neumann architecture. This has a processing unit which fetches both data and instructions from a memory store. The conduit between the processor and memory is called a "bus". The CPU has to fetch instructions and then data sequentially, which creates a problem called the "Von Neumann bottleneck":

Quote:
The shared bus between the program memory and data memory leads to the von Neumann bottleneck, the limited throughput (data transfer rate) between the central processing unit (CPU) and memory compared to the amount of memory. Because the single bus can only access one of the two classes of memory at a time, throughput is lower than the rate at which the CPU can work. This seriously limits the effective processing speed when the CPU is required to perform minimal processing on large amounts of data. The CPU is continually forced to wait for needed data to be transferred to or from memory. Since CPU speed and memory size have increased much faster than the throughput between them, the bottleneck has become more of a problem, a problem whose severity increases with every newer generation of CPU.
Bolded will be important in a sec. So, there are a number of ways that have been tried to mitigate the Von Neumann bottleneck. One thing you can do is try to be predictive about what the CPU will want next. So if the CPU requests data in memory location 1, then memory location 2, then memory location 3, then you can pre-fetch memory location 4, under the assumption that that's often what the CPU will want next.

So what problems are the absolute worst for the Von Neumann bottleneck? Per bolded above, when the CPU needs to perform "minimal processing on large amounts of data", which describes traversing a graph. Things are even worse when the nodes of the graph are distributed randomly and unpredictably through memory; or "sparsely" in the jargon.
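To see why graph traversal is the worst case, here's a standard breadth-first search over the adjacency-list representation (a textbook algorithm, not anything HIVE-specific). Every step does trivial work on a tiny piece of data, then jumps to a node that could live anywhere in memory, so a prefetcher has nothing to predict and the CPU mostly waits on the bus.

```python
from collections import deque

def bfs_reachable(graph, start):
    """Return the set of nodes reachable from `start`.

    Each iteration is "minimal processing on a small piece of data":
    look up a node, read its edge list, enqueue unseen neighbors.
    When nodes are scattered sparsely through memory, nearly every
    lookup is a random access, the pattern the Von Neumann
    bottleneck punishes hardest.
    """
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

A graph analytics processor would aim to make exactly this kind of scattered, many-to-many access fast instead of bottlenecked.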

Knowing all this, you should now be able to read and understand this:

Quote:
Graph analytic processors do not exist today, but they theoretically differ from CPUs and GPUs in key ways. First of all, they are optimized for processing sparse graph primitives. Because the items they process are sparsely located in global memory, they also involve a new memory architecture that can access randomly placed memory locations at ultra-high speeds (up to terabytes per second)... The graph analytics processor is needed, according to DARPA, for Big Data problems, which typically involve many-to-many rather than many-to-one or one-to-one relationships for which today's processors are optimized.
How exactly that works, I don't know. It's more about the possible paradigm shift away from tried and true Von Neumann architecture, and the possible applications. From another article:

Quote:
A main objective of the HIVE program is to create a graph analytics processor, which can more efficiently find and represent links between data elements and categories. These could include person-to-person interactions, and disparate links such as geography, change in doctor visit trends, or social media and regional strife... DARPA believes that such a graph processor could achieve a “thousandfold improvement in processing efficiency,” over today’s best processors.
You can see the potential.
