WordPress Sites Under Attack From New Zero-Day In WP Mobile Detector Plugin

An anonymous reader writes: A large number of websites have been infected with SEO spam thanks to a new zero-day in the WP Mobile Detector plugin, which was installed on over 10,000 websites. The zero-day had been exploited in real-world attacks since May 26, but only came to light on May 29, when researchers notified the plugin’s developer. When the developer was slow to react, security researchers informed Automattic, which had the plugin delisted from WordPress.org’s Plugin Directory on May 31. In the meantime, security firm Sucuri says it detected numerous attacks exploiting the flaw, which stems from a lack of input filtering in an image upload field and allows attackers to upload PHP backdoors to victims’ servers with incredible ease and without any tricky workarounds. The backdoor’s password is “dinamit,” the Russian word for dynamite.



Read more of this story at Slashdot.


Original URL: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/XldKZd9pBWI/wordpress-sites-under-attack-from-new-zero-day-in-wp-mobile-detector-plugin

Original article

Blade Runner re-encoded using neural networks

Last week, Warner Bros. issued a DMCA takedown notice to the video streaming website Vimeo. The notice concerned a pretty standard list of illegally uploaded files from media properties Warner owns the copyright to — including episodes of Friends and Pretty Little Liars, as well as two uploads featuring footage from the Ridley Scott movie Blade Runner.

Just a routine example of copyright infringement, right? Not exactly. Warner Bros. had just made a fascinating mistake. Some of the Blade Runner footage — which Warner has since reinstated — wasn’t actually Blade Runner footage. Or, rather, it was, but not in any form the world had ever seen.

Instead, it was part of a unique machine-learned encoding project, one that had attempted to reconstruct the classic Philip K. Dick android fable from a pile of disassembled data.

Sample reconstruction from the opening scene of Blade Runner.

In other words: Warner had just DMCA’d an artificial reconstruction of a film about artificial intelligence being indistinguishable from humans, because it couldn’t distinguish between the simulation and the real thing.

Deconstructing Blade Runner using artificial intelligence

Terence Broad is a researcher living in London and working on a master’s degree in creative computing. His dissertation, “Autoencoding Video Frames,” sounds straightforwardly boring, until you realize that it’s the key to the weird tangle of remix culture, internet copyright issues, and artificial intelligence that led Warner Bros. to file its takedown notice in the first place.

Broad’s goal was to apply “deep learning,” a machine learning technique built on multi-layered artificial neural networks, to video; he wanted to discover what kinds of creations a rudimentary form of AI might be able to generate when it was “taught” to understand real video data.

As a medium, video contains a huge amount of visual information. When you watch a video on a computer, all that information has usually been encoded/compressed and then decoded/decompressed to allow a computer to read files that would otherwise be too big to store on its hard drive.

Normally, video encoding happens through an automated electronic process using a compression standard developed by humans who decide what the parameters should be — how much data should be compressed into what format, and how to package and reduce different kinds of data like aspect ratio, sound, metadata, and so forth.

Broad wanted to teach an artificial neural network how to achieve this video encoding process on its own, without relying on the human factor. An artificial neural network is a software simulacrum of the functions carried out by the brain and the central nervous system. It accomplishes complex tasks the way a nervous system does: its many simple, interconnected parts each gather information and pass it along to the system as a whole.

Broad hoped that if he was successful, this new way of encoding might become “a new technique in the production of experimental image and video.” But before that could happen, he had to teach the neural network how to watch a movie — not like a person would, but like a machine.

Do encoders dream of electric sheep? (Or, how do you “teach” an AI to watch a film?)

Broad decided to use a type of neural network called a convolutional autoencoder. First, he set up what’s called a “learned similarity metric” to help the encoder identify Blade Runner data. The metric had the encoder read data from selected frames of the film, as well as “false” data, or data that’s not part of the film. By comparing the data from the film to the “outside” data, the encoder “learned” to recognize the similarities among the pieces of data that were actually from Blade Runner. In other words, it now knew what the film “looked” like.

Once it had taught itself to recognize the Blade Runner data, the encoder reduced each frame of the film to a 200-digit representation of itself and reconstructed those 200 digits into a new frame intended to match the original. (Broad chose a small file size, which contributes to the blurriness of the reconstruction in the images and videos I’ve included in this story.) Finally, Broad had the encoder resequence the reconstructed frames to match the order of the original film.
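
To make the idea a little more concrete, here is a minimal sketch of a convolutional autoencoder in Python (written with PyTorch as an assumed library). It is an illustration only, not Broad's actual model or training setup, which also relies on the learned similarity metric described above: this toy version simply squeezes each frame into a 200-number code and learns to rebuild the frame from that code by minimizing pixel error.

    # Minimal convolutional autoencoder sketch (PyTorch assumed).
    # Not Broad's model; just illustrates "frame -> 200-number code -> frame".
    import torch
    import torch.nn as nn

    class ConvAutoencoder(nn.Module):
        def __init__(self, latent_dim=200):
            super().__init__()
            # Encoder: compress a 3x64x64 frame down to a 200-number code.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
                nn.ReLU(),
                nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
                nn.ReLU(),
                nn.Conv2d(64, 128, 4, stride=2, padding=1), # 16 -> 8
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(128 * 8 * 8, latent_dim),
            )
            # Decoder: expand the 200-number code back into a full frame.
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 128 * 8 * 8),
                nn.ReLU(),
                nn.Unflatten(1, (128, 8, 8)),
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
                nn.ReLU(),
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
                nn.ReLU(),
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
                nn.Sigmoid(),
            )

        def forward(self, frame):
            code = self.encoder(frame)   # frame -> 200-dim representation
            return self.decoder(code)    # representation -> reconstructed frame

    # Training sketch: learn to reconstruct frames by minimizing pixel error.
    model = ConvAutoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    frames = torch.rand(8, 3, 64, 64)  # stand-in for a batch of film frames
    for _ in range(10):
        opt.zero_grad()
        loss = loss_fn(model(frames), frames)
        loss.backward()
        opt.step()

In Broad's real pipeline, the network is trained on frames sampled from the film and then every frame of the movie is run through it, with the outputs reassembled in the original order.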

In addition to Blade Runner, Broad also “taught” his autoencoder to “watch” the rotoscope-animated film A Scanner Darkly. Both films are adaptations of famed Philip K. Dick sci-fi novels, and Broad felt they would be especially fitting for the project (more on that below).

Broad repeated the “learning” process a total of six times for both films, each time tweaking the algorithm he used to help the machine get smarter about deciding how to read the assembled data. Here’s what selected frames from Blade Runner looked like to the encoder after the sixth training. Below we see two columns of before/after shots. On the left is the original frame; on the right is the encoder’s interpretation of the frame:




Real and generated samples from the first half of Blade Runner in steps of 4,000 frames, alternating real and constructed images. (Image: Autoencoding Video Frames)

During the six training rounds, Broad only used select frames from the two films. Once he finished the sixth round of training and fine-tuning, Broad instructed the neural network to reconstruct the entirety of both films, based on what it had “learned.” Here’s a glimpse at how A Scanner Darkly turned out:

Broad told Vox in an email that the neural network’s version of the film was entirely unique, created based on what it “sees” in the original footage. “In essence, you are seeing the film through the neural network. So [the reconstruction] is the system’s interpretation of the film (and the other films I put through the models), based on its limited representational ‘understanding.'”

Why Philip K. Dick’s work is perfect for this project

Dick was a legendary science fiction writer whose work frequently combined a focus on social issues with explorations in metaphysics and the reality of our universe. The many screen adaptations his works have inspired include Minority Report, Total Recall, The Adjustment Bureau, and the Amazon TV series The Man in the High Castle.

And then there’s his famous novel Do Androids Dream of Electric Sheep?, which formed the basis of Blade Runner, a dystopian sci-fi masterpiece and one of the greatest films ever made. In the film, Harrison Ford’s character Rick Deckard has a job that involves hunting down and killing “replicants” — an advanced group of androids that pass for humans in nearly every way. The film’s antagonist, Roy Batty, is one of these replicants, famously played by a world-weary Rutger Hauer. Batty struggles with his humanity while fighting to extend his life and defeat Deckard before Deckard “retires him.”

Dick was deeply concerned with the gap between the “only apparently real” and the “really real.” In his dissertation, Broad said that he felt using two of Dick’s works for his simulation project was only fitting:

[T]here could not be a more apt film to explore these themes [of subjective rationality] with than Blade Runner (1982)… which was one of the first novels to explore the themes of artificial subjectivity, and which repeatedly depicts eyes, photographs and other symbols alluding to perception.

The other film chosen to model for this project is A Scanner Darkly (2006), another adaptation of a Philip K. Dick novel (2011 [1977]). This story also explores themes of the nature of reality, and is particularly interesting for being reconstructed with a neural network as every frame of the film has already been reconstructed (hand traced over the original film) by an animator.

In other words, using Blade Runner had a deeply symbolic meaning relative to a project involving artificial recreation. “I felt like the first ever film remade by a neural network had to be Blade Runner,” Broad told Vox.

A copyright conundrum

These complexities and nuances of sci-fi culture and artificial learning were quite possibly lost on whoever decided to file the takedown claim for Warner Bros. Perhaps that’s why, after Vox contacted Warner Bros., the company conducted an investigation and reinstated the two videos it had initially taken down.

Still, Broad noted to Vox that the way he used Blade Runner in his AI research doesn’t exactly constitute a cut-and-dried legal case: “No one has ever made a video like this before, so I guess there is no precedent for this and no legal definition of whether these reconstructed videos are an infringement of copyright.”

But whether or not his videos continue to rise above copyright claims, Broad’s experiments won’t just stop with Blade Runner. On Medium, where he detailed the project, he wrote that he “was astonished at how well the model performed as soon as I started training it on Blade Runner,” and that he would “certainly be doing more experiments training these models on more films in future to see what they produce.”

The potential for machines to accurately and easily “read” and recreate video footage opens up exciting possibilities both for artificial intelligence and video creation. Obviously there’s still a long way to go before Broad’s neural network generates earth-shattering video technology, but we can safely say already — we’ve seen things you people wouldn’t believe.


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/tJMWk2_iezU/blade-runner-neural-network-encoding

Original article

Microsoft’s Official Guide for a DIY, Raspberry Pi-Powered Magic Mirror with Face Detection

Smart mirrors have been all the rage this year, and it looks like Microsoft’s getting into the game too. While Microsoft’s mirror is teased as a commercial product, they’ve released the source code if you’re interested in making one for yourself.

Read more…


Original URL: http://feeds.gawker.com/~r/lifehacker/full/~3/9AkqMa46INM/microsofts-official-guide-for-a-diy-raspberry-pi-power-1780388446

Original article

Carte Blanche – isolated development space with integrated fuzz testing


Carte Blanche is an isolated development space with integrated fuzz testing for your components. See them individually, explore them in different states and quickly and confidently develop them.

Screenshot of Carte Blanche

30-second feature video on YouTube

Setup

Setting up Carte Blanche is an easy two-step process:

  1. Install the plugin with npm install --save-dev carte-blanche

  2. Add it to the plugins in your development webpack configuration, specifying a relative path to the folder with your components in the componentRoot option:

    var CarteBlanche = require('carte-blanche');
    /* … */
    plugins: [
      new CarteBlanche({
        componentRoot: './src/components'
      })
    ],

That’s it! Now start your development environment and go to /carte-blanche to see your Carte Blanche!
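
For reference, a complete minimal development config with Carte Blanche wired in might look roughly like the sketch below. The entry and output paths are placeholders for your own project, not something Carte Blanche requires:

    // webpack.config.js (sketch; adjust entry/output to your own project)
    var CarteBlanche = require('carte-blanche');

    module.exports = {
      entry: './src/index.js',
      output: {
        path: __dirname + '/build',
        filename: 'bundle.js'
      },
      plugins: [
        new CarteBlanche({
          componentRoot: './src/components'
        })
      ]
    };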

Options

You can specify some options for the webpack plugin:

  • componentRoot (required): Folder where your component modules are.

      plugins: [
        new CarteBlanche({
          componentRoot: 'src/components'
        })
      ]
  • dest (default: 'carte-blanche'): Change the location of your Carte Blanche. Needs to be a path.

      plugins: [
        new CarteBlanche({
          componentRoot: 'src/components',
          dest: 'components'
        })
      ]
  • plugins (default: ReactPlugin): An array of plugins to use in your Carte Blanche. (Want to write your own? See writing-plugins.md for more information!)

      var ReactPlugin = require('carte-blanche-react-plugin');
      var SourcePlugin = require('carte-blanche-source-plugin');
    
      plugins: [
        new CarteBlanche({
          componentRoot: 'src/components',
          plugins: [
           new SourcePlugin({ /* …options for the plugin here… */ }),
           new ReactPlugin()
          ]
        })
      ]
  • filter (default: matches uppercase files and uppercase folders with an index file): Regex that matches your components in the componentRoot folder. We do not recommend changing this, as it might have unintended side effects.

      plugins: [
        new CarteBlanche({
          filter: /.*\.jsx$/ // Matches all files ending in .jsx
        })
      ]

This project has a custom plugin system to make it as extensible as possible. By default, we include the ReactPlugin, which has options of its own. (To pass these in, you’ll have to explicitly specify it with the plugins option.)

ReactPlugin Options

  • variationFolderName (default: variations): The name of the folder that stores the variation files.

    new ReactPlugin({
      variationFolderName: 'examples'
    })
  • port (default: 8082): The port the variations server runs at.

    new ReactPlugin({
      port: 7000
    })
  • hostname (default: localhost): The hostname the variations server runs at.

    new ReactPlugin({
      hostname: 'mydomain.com'
    })

Plugins

This is a list of endorsed plugins that are usable right now:

Want to write your own plugin? Check out writing-plugins.md!

License

Copyright (c) 2016 Nikolaus Graf and Maximilian Stoiber, licensed under the MIT License.


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/dvc9vZ1M-Ow/carte-blanche

Original article

Varnish Website’s IT Infrastructure

One of the major reasons for the website upgrade the Varnish Project has been going through this month was to eat more of our own dogfood.

The principle of eating your own dogfood is important for software quality: that is how you experience what your users are dealing with, and I am not the least ashamed to admit that several obvious improvements have already appeared on my TODO list as a result of this transition.

But it is also important to externalize what you learn doing so, and therefore I thought I would document here how the project's new "internal IT" works.

Hardware

Who cares?

Yes, we use some kind of hardware, but to be honest I don’t know what
it is.

Our primary site runs on a RootBSD ‘Omega’
virtual server somewhere near CDG/Paris.

And as backup/integration/testing server we can use any server, virtual or physical, as long as it has an internet connection and contemporary performance, because the entire install is scripted and under version control (more below).

Operating System

So, dogfood: Obviously FreeBSD.

Apart from the obvious reason that I wrote a lot of FreeBSD and
can get world-class support by bugging my buddies about it, there
are two equally serious reasons for the Varnish Project to run on
FreeBSD: Dogfood and jails.

Varnish Cache is not “software for Linux”, it is software for any
competent UNIX-like operating system, and FreeBSD is our primary
“keep us honest about this” platform.

Jails

You have probably heard about Docker and Containers, but FreeBSD has had jails since I wrote them in 1998, and they're a wonderful way to keep your server installation sane.

We currently have three jails: Tools, Hitch and Varnish.

Script & Version Control All The Things

We have a git repos with shell scripts which create these jails
from scratch and also a script to configure the host machine
properly.

That means that the procedure to install a clone of the server
is, unabridged:

# Install FreeBSD 10.3 (if not already done by hosting)
# Configure networking (if not already done by hosting)
# Set the clock
service ntpdate forcestart
# Get git
env ASSUME_ALWAYS_YES=yes pkg install git
# Clone the private git repo
git clone ssh://example.com/root/Admin
# Edit the machine's IP numbers in /etc/pf.conf
# Configure the host
sh build_host.sh |& tee _.bh
# Build the jails
foreach i (Tools Hitch Varnish)
        (cd $i ; sh build* |& tee _.bj)
end

From bare hardware to ready system in 15-30 minutes.

It goes without saying that this git repos contains stuff
like ssh host keys, so it should not go on github.

Backups

Right now there is nothing we need to backup.

When I move the mailserver/mailman/mailing lists over, those will need to be backed up, but here the trick is to back up only the minimal set of files, and in an "exchange" format, so that future migrations and upgrades can slurp them in right away.

The Homepage

The new homepage is built with Sphinx
and lives in its own
github project (Pull requests
are very welcome!)

We have taken snapshots of some of the old web properties (Trac, the Forum, etc.) as static HTML copies.

Why on Earth…

It is a little bit tedious to get a setup like this going: whenever you tweak some config file, you need to remember to pull the change back out and put it in your Admin repos.

But that extra effort pays off so many times later.

You never have to wonder “who made that change and why” or even try
to remember what changes were needed in the first place.

For us as a project, it means that all our sysadmin people can build a clone of our infrastructure if they have a copy of our "Admin" git repos and access to github.

And when FreeBSD 11 comes out, or a new version of sphinx or something else, mucking about with things until they work can be done at leisure without guesswork.

For instance, I just added the forum snapshot by working out all the kinks on one of my test machines.

Once it was as I wanted it, I pushed the changes to the live machine and then:

varnishadm vcl.use backup
# The 'backup' VCL does a "pass" of all traffic to my server
cd Admin
git pull
cd Tools
sh build_j_tools.sh |& tee _.bj
varnishadm vcl.load foobar varnish-live.vcl
varnishadm vcl.use foobar
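
For context, the "backup" VCL referred to in the comment above can be conceptually as simple as the sketch below; the backend address is a placeholder, not the project's actual configuration:

    vcl 4.0;

    backend backup_server {
        # Placeholder address for the machine that temporarily serves the site.
        .host = "backup.example.com";
        .port = "80";
    }

    sub vcl_recv {
        # No caching: pass all traffic straight through to the backup backend.
        return (pass);
    }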

For a few minutes our website was a bit slower (because of the
extra Paris-Denmark hop), but there was never any interruption.

And by doing it this way, I know it will work next time also.

2016-04-25 /phk

All that buzz about “reproducible builds”? Yeah, not a new idea.


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/ZGGyYFVfU_Y/20160425_website.html

Original article

GCC 5.4



This is the mail archive of the
gcc-announce@gcc.gnu.org
mailing list for the GCC project.




  • From: Richard Biener
  • To: gcc-announce at gcc dot gnu dot org, gcc at gcc dot gnu dot org, info-gnu at gnu dot org
  • Date: Fri, 3 Jun 2016 16:02:37 +0200 (CEST)
  • Subject: GCC 5.4 Released
  • Authentication-results: sourceware.org; auth=none
  • Reply-to: gcc at gcc dot gnu dot org

The GNU Compiler Collection version 5.4 has been released.

GCC 5.4 is a bug-fix release from the GCC 5 branch
containing important fixes for regressions and serious bugs in
GCC 5.3 with more than 147 bugs fixed since the previous release.
This release is available from the FTP servers listed at:

  http://www.gnu.org/order/ftp.html

Please do not contact me directly regarding questions or comments
about this release.  Instead, use the resources available from
http://gcc.gnu.org.

As always, a vast number of people contributed to this GCC release
-- far too many to thank them individually!





Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/yj0fAF3lMKU/msg00001.html

Original article

Hoofbeatz Audio announces ‘i Rock N Ride’ horseback riding Bluetooth speaker

When I think of horseback riding, my mind drifts to simpler times. It also conjures thoughts of cowboys, farming, and the Amish. Since the invention of motorized vehicles, equestrian travel just seems a bit old fashioned. With all of that said, there is an apparent need to bring technology to horseback riding. How, you ask? With the Hoofbeatz Audio ‘i Rock N Ride’ Bluetooth speaker, currently on Kickstarter. You can now listen to music and answer telephone calls from the convenience of your saddle! “Riders can send or receive phone calls, ride to music and use Siri and Android voice prompts, all… [Continue Reading]


Original URL: http://feeds.betanews.com/~r/bn/~3/TEOIU-ZNNqo/

Original article

How to Listen to and Delete Everything You’ve Ever Said to Google

Here’s a fun fact: Every time you do a voice search, Google records it. And if you’re an Android user, every time you say “Ok Google,” the company records that, too. Don’t freak out, though, because Google lets you hear (and delete) these recordings. Here’s how.

Read more…


Original URL: http://feeds.gawker.com/~r/lifehacker/full/~3/fFF4HdnO-Qk/how-to-listen-to-and-delete-everything-youve-ever-said-1780366724

Original article
