A Neural Network in 11 Lines of Python

Summary: I learn best with toy code that I can play with. This tutorial teaches backpropagation via a very simple toy example: a short Python implementation.

Edit: Some folks have asked about a followup article, and I’m planning to write one. I’ll tweet it out when it’s complete at @iamtrask. Feel free to follow if you’d be interested in reading it and thanks for all the feedback!

Just Give Me The Code:

import numpy as np
X = np.array([ [0,0,1],[0,1,1],[1,0,1],[1,1,1] ])
y = np.array([[0,1,1,0]]).T
syn0 = 2*np.random.random((3,4)) - 1
syn1 = 2*np.random.random((4,1)) - 1
for j in range(60000):
    l1 = 1/(1+np.exp(-(np.dot(X,syn0))))
    l2 = 1/(1+np.exp(-(np.dot(l1,syn1))))
    l2_delta = (y - l2)*(l2*(1-l2))
    l1_delta = l2_delta.dot(syn1.T) * (l1 * (1-l1))
    syn1 += l1.T.dot(l2_delta)
    syn0 += X.T.dot(l1_delta)

However, this is a bit terse… let’s break it apart into a few simple parts.

Part 1: A Tiny Toy Network

A neural network trained with backpropagation is attempting to use input to predict output.

Inputs      Output
0  0  1     0
1  1  1     1
1  0  1     1
0  1  1     0

Consider trying to predict the output column given the three input columns. We could solve this problem by simply measuring statistics between the input values and the output values. If we did so, we would see that the leftmost input column is perfectly correlated with the output. Backpropagation, in its simplest form, measures statistics like this to make a model. Let’s jump right in and use it to do this.

2 Layer Neural Network:

import numpy as np

# sigmoid function
def nonlin(x,deriv=False):
    if(deriv==True):
        return x*(1-x)
    return 1/(1+np.exp(-x))
    
# input dataset
X = np.array([  [0,0,1],
                [0,1,1],
                [1,0,1],
                [1,1,1] ])
    
# output dataset            
y = np.array([[0,0,1,1]]).T

# seed random numbers to make calculation
# deterministic (just a good practice)
np.random.seed(1)

# initialize weights randomly with mean 0
syn0 = 2*np.random.random((3,1)) - 1

for iter in range(10000):

    # forward propagation
    l0 = X
    l1 = nonlin(np.dot(l0,syn0))

    # how much did we miss?
    l1_error = y - l1

    # multiply how much we missed by the 
    # slope of the sigmoid at the values in l1
    l1_delta = l1_error * nonlin(l1,True)

    # update weights
    syn0 += np.dot(l0.T,l1_delta)

print("Output After Training:")
print(l1)

Output After Training:
[[ 0.00966449]
 [ 0.00786506]
 [ 0.99358898]
 [ 0.99211957]]
Variable   Definition
X          Input dataset matrix where each row is a training example.
y          Output dataset matrix where each row is a training example.
l0         First layer of the network, specified by the input data.
l1         Second layer of the network, otherwise known as the hidden layer.
syn0       First layer of weights, Synapse 0, connecting l0 to l1.
*          Elementwise multiplication: two vectors of equal size multiply corresponding values 1-to-1 to generate a final vector of identical size.
-          Elementwise subtraction: two vectors of equal size subtract corresponding values 1-to-1 to generate a final vector of identical size.
x.dot(y)   If x and y are vectors, this is a dot product. If both are matrices, it’s a matrix-matrix multiplication. If only one is a matrix, it’s a vector-matrix multiplication.
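The three operations in the table can be checked in a couple of lines of NumPy (a small illustration of my own, not part of the network code):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(a * b)      # elementwise multiplication -> [ 4. 10. 18.]
print(a - b)      # elementwise subtraction    -> [-3. -3. -3.]
print(a.dot(b))   # dot product                -> 32.0
```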

As you can see in the “Output After Training”, it works!!! Before I describe the process, I recommend playing around with the code to get an intuitive feel for how it works. You should be able to run it “as is” in an IPython notebook (or a script if you must, but I HIGHLY recommend the notebook). Here are some good places to look in the code:

• Compare l1 after the first iteration and after the last iteration.
• Check out the “nonlin” function. This is what gives us a probability as output.
• Check out how l1_error changes as you iterate.
• Take apart line 36. Most of the secret sauce is here.
• Check out line 39. Everything in the network prepares for this operation.

Let’s walk through the code line by line.

Recommendation: open this blog in two screens so you can see the code while you read it. That’s kinda what I did while I wrote it. 🙂

Line 01:
This imports numpy, which is a linear algebra library. This is our only dependency.

Line 04:
This is our “nonlinearity”. While it could be one of several kinds of functions, this nonlinearity is a “sigmoid”. A sigmoid function maps any value to a value between 0 and 1. We use it to convert numbers to probabilities. It also has several other desirable properties for training neural networks.

Line 05:
Notice that this function can also generate the derivative of a sigmoid (when deriv=True). One of the desirable properties of a sigmoid function is that its output can be used to create its derivative. If the sigmoid’s output is a variable “out”, then the derivative is simply out * (1-out). This is very efficient.
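A quick sanity check of this property (my own snippet, separate from the network code): the analytic derivative out * (1 - out) agrees with a numerical slope estimate.

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)   # expects x to already be a sigmoid *output*
    return 1 / (1 + np.exp(-x))

x = 0.5
out = nonlin(x)                      # sigmoid output
analytic = nonlin(out, deriv=True)   # out * (1 - out)

# compare against a numerical slope via central differences
h = 1e-6
numeric = (nonlin(x + h) - nonlin(x - h)) / (2 * h)
print(abs(analytic - numeric) < 1e-6)  # True
```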

If you’re unfamiliar with derivatives, just think of one as the slope of the sigmoid function at a given point (different points have different slopes). For more on derivatives, check out this derivatives tutorial from Khan Academy.

Line 10:
This initializes our input dataset as a numpy matrix. Each row is a single “training example”. Each column corresponds to one of our input nodes. Thus, we have 3 input nodes to the network and 4 training examples.

Line 16:
This initializes our output dataset. In this case, I generated the dataset horizontally (with a single row and 4 columns) for space. “.T” is the transpose function. After the transpose, this y matrix has 4 rows with one column. Just like our input, each row is a training example, and each column (only one) is an output node. So, our network has 3 inputs and 1 output.

Line 20:
It’s good practice to seed your random numbers. Your numbers will still be randomly distributed, but they’ll be randomly distributed in exactly the same way each time you train. This makes it easier to see how your changes affect the network.
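A tiny illustration (my own example) of what seeding buys you:

```python
import numpy as np

np.random.seed(1)
a = np.random.random((3, 1))
np.random.seed(1)
b = np.random.random((3, 1))

# same seed, same "random" weights on every run
print(np.array_equal(a, b))  # True
```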

Line 23:
This is our weight matrix for this neural network. It’s called “syn0” to imply “synapse zero”. Since we only have 2 layers (input and output), we only need one matrix of weights to connect them. Its dimension is (3,1) because we have 3 inputs and 1 output. Another way of looking at it is that l0 is of size 3 and l1 is of size 1. Thus, we want to connect every node in l0 to every node in l1, which requires a matrix of dimensionality (3,1). 🙂

Also notice that it is initialized randomly with a mean of zero. There is quite a bit of theory that goes into weight initialization. For now, just take it as a best practice that it’s a good idea to have a mean of zero in weight initialization.

Another note is that the “neural network” is really just this matrix. We have “layers” l0 and l1 but they are transient values based on the dataset. We don’t save them. All of the learning is stored in the syn0 matrix.

Line 25:
This begins our actual network training code. This for loop “iterates” multiple times over the training code to optimize our network to the dataset.

Line 28:
Our first layer, l0, is simply our data, so we explicitly describe it as such at this point. Remember that X contains 4 training examples (rows). We’re going to process all of them at the same time in this implementation. This is known as “full batch” training. Thus, we have 4 different l0 rows, but you can think of it as a single training example if you want. It makes no difference at this point. (We could load in 1000 or 10,000 if we wanted to without changing any of the code.)

Line 29:
This is our prediction step. Basically, we first let the network “try” to predict the output given the input. We will then study how it performs so that we can adjust it to do a bit better for each iteration.

This line contains 2 steps. The first matrix-multiplies l0 by syn0. The second passes our output through the sigmoid function. Consider the dimensions of each:

(4 x 3) dot (3 x 1) = (4 x 1)

Matrix multiplication is ordered, such that the dimensions in the middle of the equation must match. The final matrix thus has the number of rows of the first matrix and the number of columns of the second matrix.

Since we loaded in 4 training examples, we ended up with 4 guesses for the correct answer, a (4 x 1) matrix. Each output corresponds with the network’s guess for a given input. Perhaps it becomes intuitive why we could have “loaded in” an arbitrary number of training examples. The matrix multiplication would still work out. 🙂
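To see those shapes concretely, here's a small sketch (illustrative only; X_big is a made-up stand-in for a larger dataset):

```python
import numpy as np

X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]])  # (4 x 3)
syn0 = 2 * np.random.random((3, 1)) - 1          # (3 x 1)

# one guess per training example
print(X.dot(syn0).shape)  # (4, 1)

# an arbitrary number of rows still works with the same code
X_big = np.ones((1000, 3))
print(X_big.dot(syn0).shape)  # (1000, 1)
```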

Line 32:
So, given that l1 has a “guess” for each input, we can now compare how well it did by subtracting the guess (l1) from the true answer (y). l1_error is just a vector of positive and negative numbers reflecting how much the network missed.

Line 36:
Now we’re getting to the good stuff! This is the secret sauce! There’s a lot going on in this line, so let’s further break it into two parts.

First Part: The Derivative

nonlin(l1,True)

For each value in l1, the code above generates the slope of the sigmoid at that point. Notice that very high values such as x=2.0 and very low values such as x=-1.0 have rather shallow slopes; the steepest slope is at x=0. This plays an important role. Also notice that all the derivatives are between 0 and 1.

Entire Statement: The Error Weighted Derivative

l1_delta = l1_error * nonlin(l1,True)

There are more “mathematically precise” names for this than “The Error Weighted Derivative”, but I think it captures the intuition. l1_error is a (4,1) matrix. nonlin(l1,True) returns a (4,1) matrix. We multiply them “elementwise”, returning a (4,1) matrix, l1_delta, with the multiplied values.

When we multiply the “slopes” by the error, we are reducing the error of high confidence predictions. Look at the sigmoid picture again! If the slope was really shallow (close to 0), then the network either had a very high value, or a very low value. This means that the network was quite confident one way or the other. However, if the network guessed something close to (x=0, y=0.5) then it isn’t very confident. We update these “wishy-washy” predictions most heavily, and we tend to leave the confident ones alone by multiplying them by a number close to 0.
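Here is a small illustration (with made-up numbers) of how the slope mutes updates for confident predictions:

```python
import numpy as np

l1 = np.array([[0.99], [0.50]])      # one confident guess, one wishy-washy guess
l1_error = np.array([[0.3], [0.3]])  # give both the same error

slope = l1 * (1 - l1)                # sigmoid derivative, computed from the output
l1_delta = l1_error * slope

# the confident row (0.99) gets a tiny update (~0.003);
# the uncertain row (0.50) gets a much larger one (0.075)
print(l1_delta)
```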

Line 39:
We are now ready to update our network! Let’s take a look at a single training example.

In this training example, we’re all set up to update our weights. Let’s update the far left weight (9.5).

weight_update = input_value * l1_delta

For the far left weight, this would multiply 1.0 * the l1_delta. Presumably, this would increment 9.5 ever so slightly. Why only a small amount? Well, the prediction was already very confident, and largely correct. A small error and a small slope mean a VERY small update. Considering all the weights, it would ever so slightly increase all three.

However, because we’re using a “full batch” configuration, we’re doing the above step on all four training examples at once. So, what does line 39 do? It computes the weight updates for each weight for each training example, sums them, and updates the weights, all in a single line. Play around with the matrix multiplication and you’ll see it do this!
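You can verify that the one-line batch update really is the sum of the per-example updates (the l1_delta values below are made up for illustration):

```python
import numpy as np

l0 = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]], dtype=float)  # (4 x 3)
l1_delta = np.array([[0.1],[0.2],[0.3],[0.4]])                 # (4 x 1)

# the one-liner used in the network
batch_update = l0.T.dot(l1_delta)                              # (3 x 1)

# the same thing, one training example at a time, summed
manual = sum(l0[i:i+1].T.dot(l1_delta[i:i+1]) for i in range(4))
print(np.allclose(batch_update, manual))  # True
```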

Takeaways:

So, now that we’ve looked at how the network updates, let’s look back at our training data and reflect. When both an input and an output are 1, we increase the weight between them. When an input is 1 and an output is 0, we decrease the weight between them.

Inputs      Output
0  0  1     0
1  1  1     1
1  0  1     1
0  1  1     0

Thus, in our four training examples above, the weight from the first input to the output would consistently increase or remain unchanged, whereas the other two weights would find themselves both increasing and decreasing across training examples (cancelling out progress). This phenomenon is what causes our network to learn based on correlations between the input and output.

Part 2: A Slightly Harder Problem

Inputs      Output
0  0  1     0
0  1  1     1
1  0  1     1
1  1  1     0

Consider trying to predict the output column given the input columns. A key takeaway should be that neither of the first two columns, on its own, has any correlation with the output: each has a 50% chance of predicting a 1 and a 50% chance of predicting a 0.

So, what’s the pattern? It appears to be completely unrelated to column three, which is always 1. However, columns 1 and 2 give more clarity. If either column 1 or 2 are a 1 (but not both!) then the output is a 1. This is our pattern.

This is considered a “nonlinear” pattern because there isn’t a direct one-to-one relationship between the input and output. Instead, there is a one-to-one relationship between the output and a combination of inputs, namely columns 1 and 2.
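You can check this numerically (a small sketch of my own using NumPy's corrcoef on the dataset above):

```python
import numpy as np

X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])  # XOR of the first two columns

# neither of the first two columns alone correlates with y
for col in range(2):
    r = np.corrcoef(X[:, col], y)[0, 1]
    print(abs(r) < 1e-9)  # True for both columns

# but the combination of columns 1 and 2 determines y exactly
print(np.array_equal(np.logical_xor(X[:, 0], X[:, 1]).astype(float), y))  # True
```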

Believe it or not, image recognition is a similar problem. If one had 100 identically sized images of pipes and bicycles, no individual pixel position would directly correlate with the presence of a bicycle or pipe. The pixels might as well be random from a purely statistical point of view. However, certain combinations of pixels are not random, namely the combination that forms the image of a bicycle or a pipe.

Our Strategy

In order to first combine pixels into something that can then have a one-to-one relationship with the output, we need to add another layer. Our first layer will combine the inputs, and our second layer will then map them to the output using the output of the first layer as input. Before we jump into an implementation though, take a look at this table.

Inputs (l0)    Hidden Values (l1)       Output (l2)
0  0  1        0.1  0.2  0.5  0.2       0
0  1  1        0.2  0.6  0.7  0.1       1
1  0  1        0.3  0.2  0.3  0.9       1
1  1  1        0.2  0.1  0.3  0.8       0

If we randomly initialize our weights, we will get hidden state values for layer 1. Notice anything? The second column (second hidden node), has a slight correlation with the output already! It’s not perfect, but it’s there. Believe it or not, this is a huge part of how neural networks train. (Arguably, it’s the only way that neural networks train.) What the training below is going to do is amplify that correlation. It’s both going to update syn1 to map it to the output, and update syn0 to be better at producing it from the input!

Note: The field of adding more layers to model more combinations of relationships such as this is known as “deep learning” because of the increasingly deep layers being modeled.

3 Layer Neural Network:

import numpy as np

def nonlin(x,deriv=False):
    if(deriv==True):
        return x*(1-x)

    return 1/(1+np.exp(-x))

X = np.array([[0,0,1],
              [0,1,1],
              [1,0,1],
              [1,1,1]])

y = np.array([[0],
              [1],
              [1],
              [0]])

np.random.seed(1)

# randomly initialize our weights with mean 0
syn0 = 2*np.random.random((3,4)) - 1
syn1 = 2*np.random.random((4,1)) - 1

for j in range(60000):

    # Feed forward through layers 0, 1, and 2
    l0 = X
    l1 = nonlin(np.dot(l0,syn0))
    l2 = nonlin(np.dot(l1,syn1))

    # how much did we miss the target value?
    l2_error = y - l2

    if (j % 10000) == 0:
        print("Error:" + str(np.mean(np.abs(l2_error))))

    # in what direction is the target value?
    # were we really sure? if so, don't change too much.
    l2_delta = l2_error*nonlin(l2,deriv=True)

    # how much did each l1 value contribute to the l2 error (according to the weights)?
    l1_error = l2_delta.dot(syn1.T)

    # in what direction is the target l1?
    # were we really sure? if so, don't change too much.
    l1_delta = l1_error * nonlin(l1,deriv=True)

    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)

Runtime Output:

Error:0.496410031903
Error:0.00858452565325
Error:0.00578945986251
Error:0.00462917677677
Error:0.00395876528027
Error:0.00351012256786
Variable   Definition
X          Input dataset matrix where each row is a training example.
y          Output dataset matrix where each row is a training example.
l0         First layer of the network, specified by the input data.
l1         Second layer of the network, otherwise known as the hidden layer.
l2         Final layer of the network, which is our hypothesis, and should approximate the correct answer as we train.
syn0       First layer of weights, Synapse 0, connecting l0 to l1.
syn1       Second layer of weights, Synapse 1, connecting l1 to l2.
l2_error   This is the amount that the neural network “missed”.
l2_delta   This is the error of the network scaled by the confidence. It’s almost identical to the error except that very confident errors are muted.
l1_error   Weighting l2_delta by the weights in syn1, we can calculate the error in the middle/hidden layer.
l1_delta   This is the l1 error of the network scaled by the confidence. Again, it’s almost identical to the l1_error except that confident errors are muted.

Recommendation: open this blog in two screens so you can see the code while you read it. That’s kinda what I did while I wrote it. 🙂

Everything should look very familiar! It’s really just two copies of the previous implementation stacked on top of each other. The output of the first layer (l1) is the input to the second layer. The only new thing happening here is on line 43.

Line 43: uses the “confidence weighted error” from l2 to establish an error for l1. To do this, it simply sends the error across the weights from l2 to l1. This gives what you could call a “contribution weighted error” because we learn how much each node value in l1 “contributed” to the error in l2. This step is called “backpropagating” and is the namesake of the algorithm. We then update syn0 using the same steps we did in the 2 layer implementation.
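The shapes of this backpropagating step can be sketched in isolation (the l2_delta values below are made up for illustration):

```python
import numpy as np

np.random.seed(1)
l2_delta = np.array([[0.1], [-0.2], [0.05], [0.0]])  # (4 x 1) confidence-weighted error at the output
syn1 = 2 * np.random.random((4, 1)) - 1              # (4 x 1) weights connecting l1 to l2

# send the error back across the weights: each hidden node is
# blamed in proportion to the weight connecting it to the output
l1_error = l2_delta.dot(syn1.T)
print(l1_error.shape)  # (4, 4): one error value per hidden node for each of the 4 examples
```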

Part 3: Conclusion and Future Work

My Recommendation:

If you’re serious about neural networks, I have one recommendation. Try to rebuild this network from memory. I know that might sound a bit crazy, but it seriously helps. If you want to be able to create arbitrary architectures based on new academic papers or read and understand sample code for these different architectures, I think that it’s a killer exercise. I think it’s useful even if you’re using frameworks like Torch, Caffe, or Theano. I worked with neural networks for a couple years before performing this exercise, and it was the best investment of time I’ve made in the field (and it didn’t take long).

Future Work

This toy example still needs quite a few bells and whistles to really approach the state-of-the-art architectures. Here are a few things you can look into if you want to further improve your network. (Perhaps I will in a followup post.)

• Alpha
• Bias Units
• Mini-Batches
• Delta Trimming
• Parameterized Layer Sizes
• Regularization
• Dropout
• Momentum
• Batch Normalization
• GPU Compatibility
• Other Awesomeness You Implement

Want to Work in Machine Learning?

One of the best things you can do to learn Machine Learning is to have a job where you’re practicing Machine Learning professionally. I’d encourage you to check out the positions at Digital Reasoning in your job hunt. If you have questions about any of the positions or about life at Digital Reasoning, feel free to send me a message on my LinkedIn. I’m happy to hear about where you want to go in life, and help you evaluate whether Digital Reasoning could be a good fit.

If none of the positions above feels like a good fit, continue your search! Machine Learning expertise is one of the most valuable skills in the job market today, and there are many firms looking for practitioners. Perhaps some of these services below will help you in your hunt.




1000 nodes and beyond: updates to Kubernetes performance and scalability in 1.2

Editor’s note: this is the first in a series of in-depth posts on what’s new in Kubernetes 1.2

We’re proud to announce that with the release of 1.2, Kubernetes now supports 1000-node clusters, with a reduction of 80% in 99th percentile tail latency for most API operations. This means in just six months, we’ve increased our overall scale by 10 times while maintaining a great user experience: the 99th percentile pod startup times are less than 3 seconds, and the 99th percentile latency of most API operations is tens of milliseconds (the exception being LIST operations, which take hundreds of milliseconds in very large clusters).

Words are fine, but nothing speaks louder than a demo. Check this out!

In the above video, you saw the cluster scale up to 10 M queries per second (QPS) over 1,000 nodes, including a rolling update, with zero downtime and no impact to tail latency. That’s big enough to be one of the top 100 sites on the Internet!

In this blog post, we’ll cover the work we did to achieve this result, and discuss some of our future plans for scaling even higher.

Methodology 

We benchmark Kubernetes scalability against the following Service Level Objectives (SLOs):

  1. API responsiveness: 99% of all API calls return in less than 1s. 
  2. Pod startup time: 99% of pods and their containers (with pre-pulled images) start within 5s. 

We say Kubernetes scales to a certain number of nodes only if both of these SLOs are met.
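To make an SLO check like this concrete, here is a toy Python sketch of my own (the latency numbers are synthetic, not real Kubernetes measurements):

```python
import numpy as np

# hypothetical API call latencies in seconds (synthetic data for illustration)
latencies = np.random.RandomState(0).exponential(scale=0.05, size=10000)

# SLO 1: 99% of all API calls return in less than 1s,
# i.e. the 99th percentile latency must be under 1 second
p99 = np.percentile(latencies, 99)
print(p99 < 1.0)  # True for this synthetic distribution
```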

We continuously collect and report the measurements described above as part of the project test framework. This battery of tests breaks down into two parts: API responsiveness and Pod Startup Time.

API responsiveness for user-level abstractions 

Kubernetes offers high-level abstractions for users to represent their applications. For example, the ReplicationController is an abstraction representing a collection of pods. Listing all ReplicationControllers or listing all pods from a given ReplicationController is a very common use case. On the other hand, there is little reason someone would want to list all pods in the system: for example, 30,000 pods (1000 nodes with 30 pods per node) represent ~150MB of data (~5kB/pod * 30k pods). So this test uses ReplicationControllers.

For this test (assuming N to be number of nodes in the cluster), we:

  1. Create roughly 3xN ReplicationControllers of different sizes (5, 30 and 250 replicas), which altogether have 30xN replicas. We spread their creation over time (i.e. we don’t start all of them at once) and wait until all of them are running. 
  2. Perform a few operations on every ReplicationController (scale it, list all its instances, etc.), spreading those over time, and measuring the latency of each operation. This is similar to what a real user might do in the course of normal cluster operation. 
  3. Stop and delete all ReplicationControllers in the system. 

For the results of this test, see the “Metrics from Kubernetes 1.2” section below.

For the v1.3 release, we plan to extend this test by also creating Services, Deployments, DaemonSets, and other API objects.

Pod startup end-to-end latency 

Users are also very interested in how long it takes Kubernetes to schedule and start a pod. This is true not only upon initial creation, but also when a ReplicationController needs to create a replacement pod to take over from one whose node failed.

We (assuming N to be the number of nodes in the cluster):

  1. Create a single ReplicationController with 30xN replicas and wait until all of them are running. We are also running high-density tests, with 100xN replicas, but with fewer nodes in the cluster. 
  2. Launch a series of single-pod ReplicationControllers – one every 200ms. For each, we measure “total end-to-end startup time” (defined below). 
  3. Stop and delete all pods and replication controllers in the system. 

We define “total end-to-end startup time” as the time from the moment the client sends the API server a request to create a ReplicationController, to the moment when “running & ready” pod status is returned to the client via watch. That means that “pod startup time” includes the ReplicationController being created and in turn creating a pod, scheduler scheduling that pod, Kubernetes setting up intra-pod networking, starting containers, waiting until the pod is successfully responding to health-checks, and then finally waiting until the pod has reported its status back to the API server and then API server reported it via watch to the client.

While we could have decreased the “pod startup time” substantially by excluding for example waiting for report via watch, or creating pods directly rather than through ReplicationControllers, we believe that a broad definition that maps to the most realistic use cases is the best for real users to understand the performance they can expect from the system.

Metrics from Kubernetes 1.2 

So what was the result? We ran our tests on Google Compute Engine, setting the size of the master VM based on the size of the Kubernetes cluster. In particular, for 1000-node clusters we used an n1-standard-32 VM for the master (32 cores, 120GB RAM).

API responsiveness 

The following two charts present a comparison of 99th percentile API call latencies for the Kubernetes 1.2 release and the 1.0 release on 100-node clusters. (Smaller bars are better)


We present results for LIST operations separately, since these latencies are significantly higher. Note that we slightly modified our tests in the meantime, so running current tests against v1.0 would result in higher latencies than they used to.

We also ran these tests against 1000-node clusters. Note: We did not support clusters larger than 100 on GKE, so we do not have metrics to compare these results to. However, customers have reported running on 1,000+ node clusters since Kubernetes 1.0.

Since LIST operations are significantly larger, we again present them separately:

All latencies, in both cluster sizes, are well within our 1 second SLO.

Pod startup end-to-end latency 

The results for “pod startup latency” (as defined in the “Pod startup end-to-end latency” section) are presented in the following graph. For reference, we also present results from v1.0 on 100-node clusters in the first part of the graph.

As you can see, we substantially reduced tail latency in 100-node clusters, and now deliver low pod startup latency up to the largest cluster sizes we have measured. It is noteworthy that the metrics for 1000-node clusters, for both API latency and pod startup latency, are generally better than those reported for 100-node clusters just six months ago!

How did we make these improvements? 

To make these significant gains in scale and performance over the past six months, we made a number of improvements across the whole system. Some of the most important ones are listed below.

  • Created a “read cache” at the API server level 

    (https://github.com/kubernetes/kubernetes/issues/15945 )

    Since most Kubernetes control logic operates on an ordered, consistent snapshot kept up-to-date by etcd watches (via the API server), a slight delay in the arrival of that data has no impact on the correct operation of the cluster. These independent controller loops, distributed by design for extensibility of the system, are happy to trade a bit of latency for an increase in overall throughput.

    In Kubernetes 1.2 we exploited this fact to improve performance and scalability by adding an API server read cache. With this change, the API server’s clients can read data from an in-memory cache in the API server instead of reading it from etcd. The cache is updated directly from etcd via watch in the background. Those clients that can tolerate latency in retrieving data (usually the lag of cache is on the order of tens of milliseconds) can be served entirely from cache, reducing the load on etcd and increasing the throughput of the server. This is a continuation of an optimization begun in v1.1, where we added support for serving watch directly from the API server instead of etcd:
    https://github.com/kubernetes/kubernetes/blob/master/docs/proposals/apiserver-watch.md

  • Thanks to contributions from Wojciech Tyczynski at Google and Clayton Coleman and Timothy St. Clair at Red Hat, we were able to join careful system design with the unique advantages of etcd to improve the scalability and performance of Kubernetes. 

  • Introduced a “Pod Lifecycle Event Generator” (PLEG) in the Kubelet (https://github.com/kubernetes/kubernetes/blob/master/docs/proposals/pod-lifecycle-event-generator.md)

    Kubernetes 1.2 also improved density from a pods-per-node perspective: for v1.2 we test and advertise up to 100 pods on a single node (vs. 30 pods in the 1.1 release). This improvement was possible because of diligent work by the Kubernetes community through an implementation of the Pod Lifecycle Event Generator (PLEG).

    The Kubelet (the Kubernetes node agent) has a worker thread per pod which is responsible for managing the pod’s lifecycle. In earlier releases each worker would periodically poll the underlying container runtime (Docker) to detect state changes, and perform any necessary actions to ensure the node’s state matched the desired state (e.g. by starting and stopping containers). As pod density increased, concurrent polling from each worker would overwhelm the Docker runtime, leading to serious reliability and performance issues (including additional CPU utilization which was one of the limiting factors for scaling up).

    To address this problem we introduced a new Kubelet subcomponent, the PLEG, to centralize state change detection and generate lifecycle events for the workers. With concurrent polling eliminated, we were able to lower the steady-state CPU usage of the Kubelet and the container runtime by 4x. This also allowed us to adopt a shorter polling period, so as to detect and react to changes more quickly. 

  • Improved scheduler throughput
    Kubernetes community members from CoreOS (Hongchao Deng and Xiang Li) helped to dive deep into the Kubernetes scheduler and dramatically improve throughput without sacrificing accuracy or flexibility. They cut total time to schedule 30,000 pods by nearly 1400%! You can read a great blog post on how they approached the problem here: https://coreos.com/blog/improving-kubernetes-scheduler-performance.html 
  • A more efficient JSON parser
    Go’s standard library includes a flexible and easy-to-use JSON parser that can encode and decode any Go struct using the reflection API. But that flexibility comes with a cost: reflection allocates lots of small objects that have to be tracked and garbage collected by the runtime. Our profiling bore that out, showing that a large chunk of both client and server time was spent in serialization. Given that our types don’t change frequently, we suspected that a significant amount of reflection could be bypassed through code generation.

    After surveying the Go JSON landscape and conducting some initial tests, we found the ugorji codec library offered the most significant speedups – a 200% improvement in encoding and decoding JSON when using generated serializers, with a significant reduction in object allocations. After contributing fixes to the upstream library to deal with some of our complex structures, we switched Kubernetes and the go-etcd client library over. Along with some other important optimizations in the layers above and below JSON, we were able to slash the cost in CPU time of almost all API operations, especially reads. 

Kubernetes 1.3 and Beyond 

Of course, our job is not finished. We will continue to invest in improving Kubernetes performance, as we would like it to scale to many thousands of nodes, just like Google’s Borg. Thanks to our investment in testing infrastructure and our focus on how teams use containers in production, we have already identified the next steps on our path to improving scale. 
On deck for Kubernetes 1.3: 
  1.  Our main bottleneck is still the API server, which spends the majority of its time just marshaling and unmarshaling JSON objects. We plan to add support for protocol buffers to the API as an optional path for inter-component communication and for storing objects in etcd. Users will still be able to use JSON to communicate with the API server, but since the majority of Kubernetes communication is intra-cluster (API server to node, scheduler to API server, etc.), we expect a significant reduction in CPU and memory usage on the master. 
  2.  Kubernetes uses labels to identify sets of objects. For example, identifying which pods belong to a given ReplicationController requires iterating over all pods in a namespace and choosing those that match the controller’s label selector. The addition of an efficient indexer for labels that can take advantage of the existing API object cache will make it possible to quickly find the objects that match a label selector, making this common operation much faster. 
  3. Scheduling decisions are based on a number of different factors, including spreading pods based on requested resources, spreading pods with the same selectors (e.g. from the same Service, ReplicationController, Job, etc.), and the presence of needed container images on the node. Those calculations, in particular selector spreading, have many opportunities for improvement; see https://github.com/kubernetes/kubernetes/issues/22262 for just one suggested change. 
  4. We are also excited about the upcoming etcd v3.0 release, which was designed with the Kubernetes use case in mind; it will both improve performance and introduce new features. Contributors from CoreOS have already begun laying the groundwork for moving Kubernetes to etcd v3.0 (see https://github.com/kubernetes/kubernetes/pull/22604). 
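The scan-every-pod matching that item 2 above describes can be sketched as follows; Pod, matches, and selectPods are simplified illustrations, not the actual Kubernetes implementation.

```go
package main

import "fmt"

// Pod is a hypothetical, simplified stand-in for the API object.
type Pod struct {
	Name   string
	Labels map[string]string
}

// matches reports whether a pod's labels satisfy every key=value
// pair in the selector (equality-based selectors only).
func matches(labels, selector map[string]string) bool {
	for k, v := range selector {
		if labels[k] != v {
			return false
		}
	}
	return true
}

// selectPods is the O(all pods) scan the text describes: every pod
// in the namespace is examined, even those that cannot match.
func selectPods(pods []Pod, selector map[string]string) []string {
	var out []string
	for _, p := range pods {
		if matches(p.Labels, selector) {
			out = append(out, p.Name)
		}
	}
	return out
}

func main() {
	pods := []Pod{
		{"web-1", map[string]string{"app": "web"}},
		{"db-1", map[string]string{"app": "db"}},
		{"web-2", map[string]string{"app": "web"}},
	}
	fmt.Println(selectPods(pods, map[string]string{"app": "web"})) // [web-1 web-2]
}
```

A label index (e.g. a map from label key/value to the set of matching pods, kept alongside the API object cache) would let this lookup touch only the matching pods instead of the whole namespace, which is the improvement the roadmap item proposes.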

While this list does not capture all of our efforts around performance, we are optimistic we will achieve as big a performance gain as we saw going from Kubernetes 1.0 to 1.2. 


Conclusion 

In the last six months we’ve significantly improved Kubernetes scalability, allowing v1.2 to run 1000-node clusters with the same excellent responsiveness (as measured by our SLOs) as we were previously achieving only on much smaller clusters. But that isn’t enough; we want to push Kubernetes even further and faster. Kubernetes v1.3 will improve the system’s scalability and responsiveness further, while continuing to add features that make it easier to build and run the most demanding container-based applications. 
Please join our community and help us build the future of Kubernetes! There are many ways to participate. If you’re particularly interested in scalability, you’ll be interested in: 

And of course, for more information about the project in general, go to www.kubernetes.io.

Wojciech Tyczynski, Software Engineer, Google


1. We exclude operations on “events” since these are more like system logs and are not required for the system to operate properly.

2. This is test/e2e/load.go from the Kubernetes github repository.
3. This is test/e2e/density.go from the Kubernetes github repository.
4. We are looking into optimizing this in the next release, but for now using a smaller master can result in significant (order of magnitude) performance degradation. We encourage anyone running benchmarking against Kubernetes or attempting to replicate these findings to use a similarly sized master, or performance will suffer.


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/4hpVMpB4VwU/1000-nodes-and-beyond-updates-to-Kubernetes-performance-and-scalability-in-12.html


A List of Isaac Asimov’s Books

Here’s my list of Isaac Asimov’s book titles. The numbering was provided by Asimov (except for a few marked with an asterisk). After that the titles appear in roughly the order they were written, although this is an educated guess.
Note that there was often a delay between the completion and the publication
of a title, which is why there is not a strict progression in the year of
publication. Notes are delimited by numbers in brackets, and follow at the
end. The total number of entries is 506. If you have any questions or comments,
please drop me a line.

Ed Seiler

ejseiler@earthlink.net

To the Isaac Asimov home page
To the Isaac Asimov FAQ
To Jenkins’ Spoiler-Laden Guide to Isaac Asimov

  #  TITLE                                     PUBLISHER    YEAR OF PUBLICATION
--------------------------------------------------------------------------------
  1  Pebble In The Sky                         Doubleday                   1950 
  2  I, Robot                                  Gnome Press [1]             1950 
  3  The Stars, Like Dust-- (Tyrann)           Doubleday                   1951 
  4  Foundation                                Gnome Press [1]             1951 
  5  David Starr, Space Ranger [2]             Doubleday                   1952 
  6  Foundation and Empire                     Gnome Press [1]             1952 
  7  The Currents of Space                     Doubleday                   1952 
  8  Biochemistry and Human Metabolism [3]     Williams & Wilkins          1952 
  9  Second Foundation                         Gnome Press [1]             1953 
 10  Lucky Starr and the Pirates of the
      Asteroids [2]                            Doubleday                   1953 
 11  The Caves of Steel                        Doubleday                   1954 
 12  Lucky Starr and the Oceans of Venus [2]   Doubleday                   1954 
 13  The Chemicals of Life: Enzymes,
      Vitamins, and Hormones                   Abelard-Schuman             1954 
 14  The Martian Way and Other Stories         Doubleday                   1955 
 15  The End of Eternity                       Doubleday                   1955 
 16  Races and People [4]                      Abelard-Schuman             1955 
 17  Lucky Starr and the Big Sun of
      Mercury [2]                              Doubleday                   1956 
 18  Chemistry and Human Health [5]            McGraw-Hill                 1956 
 19  Inside The Atom                           Abelard-Schuman             1956 
 20  The Naked Sun                             Doubleday                   1957 
 21  Lucky Starr and the Moons of Jupiter [2]  Doubleday                   1957 
 22  Building Blocks of the Universe           Abelard-Schuman             1957 
 23  Earth Is Room Enough: Science Fiction
      Tales of Our Own Planet                  Doubleday                   1957 
 24  Only a Trillion                           Abelard-Schuman             1957 
 25  The World of Carbon                       Abelard-Schuman             1958 
 26  Lucky Starr and the Rings of Saturn [2]   Doubleday                   1958 
 27  The World of Nitrogen                     Abelard-Schuman             1958 
 28  The Death Dealers (A Whiff of Death)      Avon                        1958 
 29  Nine Tomorrows: Tales of the Near Future  Doubleday                   1959 
 30  The Clock We Live On                      Abelard-Schuman             1959 
 31  Words of Science, and the History Behind
      Them                                     Houghton Mifflin            1959 
 32  Realm of Numbers                          Houghton Mifflin            1959 
 33  The Living River                          Abelard-Schuman             1960 
 34  The Kingdom of the Sun                    Abelard-Schuman             1960 
 35  Realm of Measure                          Houghton Mifflin            1960 
 36  Breakthroughs in Science                  Houghton Mifflin            1960 
 37  Satellites in Outer Space                 Random House                1960 
 38  The Wellsprings of Life                   Abelard-Schuman             1960 
 39  The Intelligent Man's Guide to Science    Basic Books                 1960 
 40  The Double Planet                         Abelard-Schuman             1960 
 41  Words from the Myths                      Houghton Mifflin            1961 
 42  Realm of Algebra                          Houghton Mifflin            1961 
 43  Life and Energy                           Doubleday                   1962 
 44  Words in Genesis                          Houghton Mifflin            1962 
 45  Fact and Fancy                            Doubleday                   1962 
 46  Words on the Map                          Houghton Mifflin            1962 
 47  The Hugo Winners [6]                      Doubleday                   1962 
 48  The Search For The Elements               Basic Books                 1962 
 49  Words from the Exodus                     Houghton Mifflin            1963 
 50  The Genetic Code                          Orion Press                 1963 
 51  The Human Body: Its Structure and
      Operation                                Houghton Mifflin            1963 
 52  Fifty Short Science Fiction Tales [7]     Collier                     1963 
 53  View from a Height                        Doubleday                   1963 
 54  The Kite That Won the Revolution          Houghton Mifflin            1963 
 55  The Human Brain: Its Capacities and
      Functions                                Houghton Mifflin            1964 
 56  A Short History of Biology                Natural History Press [8]   1964 
 57  Quick and Easy Math                       Houghton Mifflin            1964 
 58  Adding a Dimension                        Doubleday                   1964 
 59  Planets For Man [9]                       Random House                1964 
 60  The Rest of the Robots                    Doubleday                   1964 
 61  Asimov's Biographical Encyclopedia of
      Science and Technology, 1st Ed.          Doubleday                   1964 
 62  A Short History of Chemistry              Doubleday                   1965 
 63  The Greeks: A Great Adventure             Houghton Mifflin            1965 
 64  Of Time and Space and Other Things        Doubleday                   1965 
 65  The New Intelligent Man's Guide to
      Science                                  Basic Books                 1965 
 66  An Easy Introduction to the Slide Rule    Houghton Mifflin            1965 
 67  Fantastic Voyage                          Houghton Mifflin            1966 
 68  The Noble Gases                           Basic Books                 1966 
 69  Inside The Atom (3rd revised edition)     Abelard-Schuman             1966 
 70  The Neutrino: Ghost Particle of the Atom  Doubleday                   1966 
 71  The Roman Republic                        Houghton Mifflin            1966 
 72  Understanding Physics, Volume I           Walker                      1966 
 73  Understanding Physics, Volume II          Walker                      1966 
 74  Understanding Physics, Volume III         Walker                      1966 
 75  The Genetic Effects of Radiation [10]     U.S. AEC                    1966 
 76  Tomorrow's Children: Eighteen Tales of
      Fantasy and Science Fiction [6]          Doubleday                   1966 
 77  The Universe: From Flat Earth to Quasar   Walker                      1966 
 78  From Earth to Heaven                      Doubleday                   1966 
 79  The Moon                                  Follet                      1967 
 80  Environments Out There                    Scholastic/Abelard-Schuman  1967 
 81  The Roman Empire                          Houghton Mifflin            1967 
 82  Through a Glass, Clearly                  New English Library         1967 
 83  Is Anyone There?                          Doubleday                   1967 
 84  To the Ends of the Universe               Walker                      1967 
 85  Mars                                      Follet                      1967 
 86  The Egyptians                             Houghton Mifflin            1967 
 87  Asimov's Mysteries                        Doubleday                   1968 
 88  Science, Numbers, and I                   Doubleday                   1968 
 89  Stars                                     Follet                      1968 
 90  Galaxies                                  Follet                      1968 
 91  The Near East: 10,000 Years of History    Houghton Mifflin            1968 
 92  The Dark Ages                             Houghton Mifflin            1968 
 93  Asimov's Guide To The Bible, Volume I     Doubleday                   1968 
 94  Words from History                        Houghton Mifflin            1968 
 95  Photosynthesis                            Basic Books                 1969 
 96  The Shaping of England                    Houghton Mifflin            1969 
 97  Twentieth Century Discovery               Doubleday                   1969 
 98  Nightfall and Other Stories               Doubleday                   1969 
 99  Asimov's Guide To The Bible, Volume II    Doubleday                   1969 
100  Opus 100                                  Houghton Mifflin            1969 
101  ABC's of Space                            Walker                      1969 
102  Great Ideas of Science                    Houghton Mifflin            1969 
103  The Solar System and Back                 Doubleday                   1970 
104  Asimov's Guide To Shakespeare, Volume I   Doubleday                   1970 
105  Asimov's Guide To Shakespeare, Volume II  Doubleday                   1970 
106  Constantinople: The Forgotten Empire      Houghton Mifflin            1970 
107  ABC's of the Ocean                        Walker                      1970 
108  Light                                     Follet                      1970 
109  The Stars in Their Courses                Doubleday                   1971 
110  Where Do We Go from Here? [6]             Doubleday                   1971 
111  What Makes the Sun Shine?                 Little, Brown & Co.         1971 
112  The Sensuous Dirty Old Man                Walker                      1971 
113  The Best New Thing                        World Pub. Co.              1971 
114  Isaac Asimov's Treasury of Humor          Houghton Mifflin            1971 
115  The Hugo Winners, Volume II [6]           Doubleday                   1971 
116  The Land of Canaan                        Houghton Mifflin            1971 
117  ABC's of the Earth                        Walker                      1971 
118  Asimov's Biographical Encyclopedia of
      Science and Technology, New Rev. Ed.     Doubleday                   1972 
119  The Left Hand of the Electron             Doubleday                   1972 
120  Asimov's Guide to Science                 Basic Books                 1972 
121  The Gods Themselves                       Doubleday                   1972 
122  More Words of Science                     Houghton Mifflin            1972 
123  Electricity and Man                       U.S. AEC                    1972 
124  ABC's of Ecology                          Walker                      1972 
125  The Early Asimov or, Eleven Years of
      Trying                                   Doubleday                   1972 
126  The Shaping of France                     Houghton Mifflin            1972 
127  The Story of Ruth                         Doubleday                   1972 
128  Ginn Science Program, Int. Level A        Ginn                        1972 
129  Ginn Science Program, Int. Level C        Ginn                        1972 
130  Asimov's Annotated "Don Juan"             Doubleday                   1972 
131  Worlds Within Worlds                      U.S. AEC                    1972 
132  Ginn Science Program, Int. Level B        Ginn                        1972 
133  How Did We Find Out the Earth Is Round?   Walker                      1973 
134  Comets and Meteors                        Follet                      1973 
135  The Sun                                   Follet                      1973 
136  How Did We Find Out About Electricity?    Walker                      1973 
137  The Shaping of North America: From
      Earliest Times to 1763                   Houghton Mifflin            1973 
138  Today and Tomorrow and...                 Doubleday                   1973 
139  Jupiter, the Largest Planet               Lothrop, Lee, & Shepard     1973 
140  Ginn Science Program, Adv. Level A        Ginn                        1973 
141  Ginn Science Program, Adv. Level B        Ginn                        1973 
142  How Did We Find Out About Numbers?        Walker                      1973 
143  Please Explain                            Houghton Mifflin            1973 
144  The Tragedy of the Moon                   Abelard-Schuman             1973 
145  How Did We Find Out About Dinosaurs?      Walker                      1973 
146  The Best of Isaac Asimov                  Sphere                      1973 
147  Nebula Award Stories Eight [6]            Harper & Row                1973 
148  Asimov on Astronomy                       Doubleday                   1974 
149  The Birth of the United States            Houghton Mifflin            1974 
150  Have You Seen These?                      NESRAA                      1974 
151  Before The Golden Age: A Science Fiction
      Anthology of the 1930s [6]               Doubleday                   1974 
152  Our World in Space                        New York Graphic Society    1974 
153  How Did We Find Out About Germs?          Walker                      1974 
154  Asimov's Annotated "Paradise Lost"        Doubleday                   1974 
155  Tales of the Black Widowers               Doubleday                   1974 
156  Earth: Our Crowded Spaceship              John Day                    1974 
157  Asimov on Chemistry                       Doubleday                   1974 
158  How Did We Find Out About Vitamins?       Walker                      1974 
159  Of Matters Great and Small                Doubleday                   1975 
160  The Solar System                          Follet                      1975 
161  Our Federal Union                         Houghton Mifflin            1975 
162  How Did We Find Out About Comets?         Walker                      1975 
163  Science Past, Science Future              Doubleday                   1975 
164  Buy Jupiter and Other Stories             Doubleday                   1975 
165  Eyes on the Universe: A History of the
      Telescope                                Houghton Mifflin            1975 
166  Lecherous Limericks                       Walker                      1975 
167  The Heavenly Host                         Walker                      1975 
168  The Ends of the Earth: The Polar Regions
      of the World                             Weybright & Talley          1975 
169  How Did We Find Out About Energy?         Walker                      1975 
170  "The Dream", "Benjamin's Dream", and 
      "Benjamin's Bicentennial Blast"          Benjamin Franklin Keeps.    1976 
171  Asimov on Physics                         Doubleday                   1976 
172  Murder at The ABA (Authorised Murder [U.K.])
                                               Doubleday                   1976 
173  How Did We Find Out About Atoms?          Walker                      1976 
174  Good Taste                                Apocalypse Press            1976 
175  The Planet That Wasn't                    Doubleday                   1976 
176  The Bicentennial Man and Other Stories    Doubleday                   1976 
177  More Lecherous Limericks                  Walker                      1976 
178  More Tales of the Black Widowers          Doubleday/Crime Club        1976 
179  Alpha Centauri, the Nearest Star          Lothrop, Lee, & Shepard     1976 
180  How Did We Find Out About Nuclear Power?  Walker                      1976 
181  Familiar Poems Annotated                  Doubleday                   1977 
182  The Collapsing Universe: The Story of
      Black Holes                              Walker                      1977 
183  Asimov on Numbers                         Doubleday                   1977 
184  How Did We Find Out About Outer Space?    Walker                      1977 
185  Still More Lecherous Limericks            Walker                      1977 
186  The Hugo Winners, Volume III [6]          Doubleday                   1977 
187  The Beginning and the End                 Doubleday                   1977 
188  Mars, the Red Planet                      Lothrop, Lee, & Shepard     1977 
189  The Golden Door                           Houghton Mifflin            1977 
190  The Key Word and Other Mysteries          Walker                      1977 
191  Asimov's Sherlockian Limericks            Mysterious                  1977 
192  One Hundred Great Science Fiction
      Short-Short Stories [11]                 Doubleday                   1978 
193  Quasar, Quasar, Burning Bright            Doubleday                   1978 
194  How Did We Find Out About Earthquakes?    Walker                      1978 
195  Animals of the Bible                      Doubleday                   1978 
196  Limericks: Too Gross; or Two Dozen
      Dirty Stanzas [12]                       W. W. Norton                1978 
197  How Did We Find Out About Black Holes?    Walker                      1978 
198  Life and Time                             Doubleday                   1978 
199  Saturn and Beyond                         Lothrop, Lee, & Shepard     1979 
200  Opus 200                                  Houghton Mifflin            1979 
201  In Memory Yet Green                       Doubleday                   1979 
202  Isaac Asimov Presents the Great SF
      Stories, 1: 1939 [13]                    DAW Books                   1979 
203  Extraterrestrial Civilizations            Crown                       1979 
204  How Did We Find Out About Our Human
      Roots?                                   Walker                      1979 
205  Isaac Asimov Presents the Great SF
      Stories, 2: 1940 [13]                    DAW Books                   1979 
206  The Road to Infinity                      Doubleday                   1979 
207  A Choice of Catastrophes                  Simon & Schuster            1979 
208  The Science Fictional Solar System [14]   Harper & Row                1979 
209  The Thirteen Crimes of Science
      Fiction [14]                             Doubleday                   1979 
210  Isaac Asimov's Book of Facts              Grosset & Dunlap            1979 
211  How Did We Find Out About Antarctica?     Walker                      1979 
212  Casebook of the Black Widowers            Doubleday                   1980 
213  The Future in Question [11]               Fawcett Crest               1980 
214  Isaac Asimov Presents the Great SF
      Stories, 3: 1941 [13]                    DAW Books                   1980 
215  How Did We Find Out About Oil?            Walker                      1980 
216  In Joy Still Felt                         Doubleday                   1980 
217  Who Done It? [15]                         Houghton Mifflin            1980 
218  Space Mail [11]                           Fawcett Crest               1980 
219  Microcosmic Tales: 100 Wondrous Science
      Fiction Short-Short Stories [11]         Taplinger                   1980 
220  Isaac Asimov Presents the Great SF
      Stories, 4: 1942 [13]                    DAW Books                   1980 
221  The Seven Deadly Sins of Science
      Fiction [16]                             Fawcett Crest               1980 
222  The Annotated "Gulliver's Travels"        Clarkson N. Potter          1980 
223  How Did We Find Out About Coal?           Walker                      1980 
224  The Future I [11]                         Fawcett Crest               1981 
225  In the Beginning                          Crown/Stonesong Press       1981 
226  Isaac Asimov Presents the Great SF
     Stories, 5: 1943 [13]                     DAW Books                   1981 
227  Asimov on Science Fiction                 Doubleday                   1981 
228  Venus, Near Neighbor of the Sun           Lothrop, Lee, & Shepard     1981 
229  Three by Asimov                           Targ                        1981 
230  How Did We Find Out About Solar Power?    Walker                      1981 
231  How Did We Find Out About Volcanoes?      Walker                      1981 
232  Visions of the Universe                   The Cosmos Store            1981 
233  Catastrophes! [14]                        Fawcett Crest               1981 
234  Isaac Asimov Presents the Best Science
      Fiction of the 19th Century [16]         Beaufort Books              1981 
235  The Seven Cardinal Virtues of Science
      Fiction [16]                             Fawcett Crest               1981 
236  Fantastic Creatures: An Anthology of
     Fantasy and Science Fiction [14]          Franklin Watts              1981 
237  The Sun Shines Bright                     Doubleday                   1981 
238  Change!: Seventy-one Glimpses of the
      Future                                   Houghton Mifflin            1981 
239  Raintree Reading Series I [14]            Raintree                    1981 
       Travels Through Time
       Thinking Machines
       Wild Inventions
       After The End
240  A Grossery of Limericks [12]              W. W. Norton                1981 
241  Miniature Mysteries: One Hundred
      Malicious Little Mystery Stories [11]    Taplinger                   1981 
242  The Twelve Crimes of Christmas [17]       Avon                        1981 
243  Isaac Asimov Presents the Great SF
      Stories, 6: 1944 [13]                    DAW Books                   1981 
244  Space Mail II [14]                        Fawcett Crest               1982 
245  Tantalizing Locked Room Mysteries [16]    Walker                      1982 
246  TV: 2000 [16]                             Fawcett Crest               1982 
247  Laughing Space [18]                       Houghton Mifflin            1982 
248  How Did We Find Out About Life In the
      Deep Sea                                 Walker                      1982 
249  The Complete Robot                        Doubleday                   1982 
250  Speculations [15]                         Houghton Mifflin            1982 
251  Flying Saucers [14]                       Fawcett Crest               1982 
252  Exploring the Earth and the Cosmos        Crown                       1982 
253  Raintree Reading Series II [14]           Raintree                    1982 
       Earth Invaded
       Mad Scientists
       Mutants
       Tomorrow's TV
254  How Did We Find Out About the Beginning
      of Life?                                 Walker                      1982 
255  Dragon Tales [14]                         Fawcett Crest               1982 
256  The Big Apple Mysteries [17]              Avon                        1982 
257  Asimov's Biographical Encyclopedia of
      Science and Technology, 2nd Rev. Ed.     Doubleday                   1982 
258  Isaac Asimov Presents the Great SF
      Stories, 7: 1945 [13]                    DAW Books                   1982 
259  Isaac Asimov Presents Superquiz [19]      Dembner Books               1982 
260  The Last Man on Earth [14]                Fawcett Crest               1982 
261  Science Fiction A to Z: A Dictionary of
      Great Science Fiction Themes [14]        Houghton Mifflin            1982 
262  Foundation's Edge                         Doubleday                   1982 
263  Isaac Asimov Presents the Best Fantasy of
      the 19th Century [16]                    Beaufort Books              1982 
264  Isaac Asimov Presents the Great SF
      Stories, 8: 1946 [13]                    DAW Books                   1982 
265  How Did We Find Out About the Universe?   Walker                      1982 
266  Counting the Eons                         Doubleday                   1983 
267  The Winds of Change and Other Stories     Doubleday                   1983 
268  Isaac Asimov Presents the Great SF
      Stories, 9: 1947 [13]                    DAW Books                   1983 
269  Show Business Is Murder [17]              Avon                        1983 
270  Hallucination Orbit: Psychology In
      Science Fiction [16]                     Farrar, Straus, & Giroux    1983 
271  Caught In the Organ Draft: Biology In
      Science Fiction [16]                     Farrar, Straus, & Giroux    1983 
272  The Roving Mind                           Prometheus Books            1983 
273  The Science Fiction Weight-Loss Book [20] Crown                       1983 
274  The Measure of the Universe               Harper & Row                1983 
275  Isaac Asimov Presents the Best Horror and
      Supernatural Stories of the 19th
      Century [16]                             Beaufort Books              1983 
276  Starships: Stories Beyond the Boundaries
      of the Universe [14]                     Fawcett Crest               1983 
277  The Union Club Mysteries                  Doubleday                   1983 
278  Norby, the Mixed-up Robot [21]            Walker                      1983 
279  Isaac Asimov Presents the Great SF
      Stories, 10: 1948 [13]                   DAW Books                   1983 
280  How Did We Find Out About Genes?          Walker                      1983 
281  The Robots of Dawn                        Doubleday                   1983 
282  Thirteen Horrors of Halloween [17]        Avon                        1983 
283  Creations: The Quest For Origins in
      Story and Science [22]                   Crown                       1983 
284  Isaac Asimov Presents Superquiz II [19]   Dembner Books               1983 
285  Wizards [14]                              NAL                         1983 
286  Those Amazing Electronic Thinking
      Machines!: An Anthology of Robot and
      Computer Stories [14]                    Franklin Watts              1983 
287  Computer Crimes and Capers [14]           Academy Chicago Pub.        1983 
288  Intergalactic Empires [14]                NAL                         1983 
289  Machines That Think: The Best Science
     Stories About Robots and Computers [23]   Holt, Rinehart, & Winston   1983 
290  X Stands for Unknown                      Doubleday                   1984 
291  One Hundred Great Fantasy Short-Short
      Stories [24]                             Doubleday                   1984 
292  Raintree Reading Series 3 [14]            Raintree                    1984 
       Bug Awful
       Children Of The Future
       The Immortals
       Time Warps
293  Isaac Asimov Presents the Great SF
      Stories, 11: 1949 [13]                   DAW Books                   1984 
294  Witches [14]                              NAL                         1984 
295  Murder on the Menu [17]                   Avon                        1984 
296  Young Mutants [14]                        Harper & Row                1984 
297  Isaac Asimov Presents the Best Science
      Fiction Firsts [16]                      Beaufort Books              1984 
298  Norby's Other Secret [21]                 Walker                      1984 
299  How Did We Find Out About Computers?      Walker                      1984 
300  Opus 300                                  Houghton Mifflin            1984 
301  The Science Fictional Olympics [14]       NAL                         1984 
302  Fantastic Reading: Stories & Activities
      for Grade 5-8 [25]                       Scott Foresman & Co.        1984 
303  Banquets of the Black Widowers            Doubleday                   1984 
304  Election Day 2084: Science Fiction
      Stories on the Politics of the
      Future [13]                              Prometheus Books            1984 
305  Isaac Asimov's Limericks for Children     Caedmon                     1984 
306  Isaac Asimov Presents the Great SF
      Stories, 12: 1950 [13]                   DAW Books                   1984 
307  Young Extraterrestrials [14]              Harper & Row                1984 
308  Sherlock Holmes Through Time and
      Space [14]                               Bluejay Books               1984 
309  Asimov's New Guide to Science             Basic Books                 1984 
310  Supermen [14]                             NAL                         1984 
311  Baker's Dozen: 13 Short Fantasy
      Novels [14]                              Crown                       1984 
312  How Did We Find Out About Robots?         Walker                      1984 
313  Asimov's Guide to Halley's Comet          Walker                      1985 
314  Cosmic Knights [14]                       NAL                         1985 
315  The Hugo Winners, Volume IV [6]           Doubleday                   1985 
316  Young Monsters [14]                       Harper & Row                1985 
317  The Exploding Suns: The Secrets of the
      Supernovas                               E. P. Dutton                1985 
318  Norby and the Lost Princess [21]          Walker                      1985 
319  Spells [14]                               NAL                         1985 
320  How Did We Find Out About the Atmosphere? Walker                      1985 
321  Living in the Future [26]                 Harmony House               1985 
322  Robots, Machines In Man's Image [27]      Harmony House               1985 
323  The Edge of Tomorrow                      Tor/Tom Doherty Associates  1985 
324  Great Science Fiction Stories by the
      World's Great Scientists [14]            Donald I. Fine              1985 
325  Isaac Asimov Presents the Great SF
      Stories, 13: 1951 [13]                   DAW Books                   1985 
326  The Subatomic Monster                     Doubleday                   1985 
327  The Disappearing Man and Other Mysteries  Walker                      1985 
328  Robots and Empire                         Doubleday                   1985 
329  Amazing Stories: Sixty Years of the Best
      Science Fiction [13]                     TSR Inc.                    1985 
330  Young Ghosts [14]                         Harper & Row                1985 
331  Baker's Dozen: Thirteen Short Science
      Fiction Novels [14]                      Crown                       1985 
332  It's Such a Beautiful Day                 Creative Education          1985 
333  Norby and the Invaders [21]               Walker                      1985 
334  Giants [14]                               NAL                         1985 
335  How Did We Find Out About DNA?            Walker                      1985 
336  The Alternate Asimovs                     Doubleday                   1986 
337  Isaac Asimov Presents the Great SF
      Stories, 14: 1952 [13]                   DAW Books                   1986 
338  Comets [14]                               NAL                         1986 
339  Young Star Travelers [14]                 Harper & Row                1986 
340  The Hugo Winners, Volume V [6]            Doubleday                   1986 
341  The Dangers of Intelligence and Other
      Science Essays                           Houghton Mifflin            1986 
342  Mythical Beasties [14]                    NAL                         1986 
343  How Did We Find Out About the Speed of
      Light?                                   Walker                      1986 
344  Futuredays: A Nineteenth-Century Vision
      of the Year 2000                         Henry Holt                  1986 
345  Science Fiction by Asimov                 Davis Publications          1986 
346  Tin Stars [14]                            NAL                         1986 
347  The Best Science Fiction of Isaac Asimov  Doubleday                   1986 
348  The Best Mysteries of Isaac Asimov        Doubleday                   1986 
349  Foundation and Earth                      Doubleday                   1986 
350  Robot Dreams                              Byron Preiss                1986 
351  Norby and the Queen's Necklace [21]       Walker                      1986 
352  Magical Wishes [14]                       NAL                         1986 
353  Isaac Asimov Presents the Great SF
      Stories, 15: 1953 [13]                   DAW Books                   1986 
354  Far as Human Eye Could See                Doubleday                   1987 
355  The Twelve Frights of Christmas [16]      Avon                        1986 
356  How Did We Find Out About Blood?          Walker                      1987 
357  Past, Present, and Future                 Prometheus Books            1987 
358  Isaac Asimov Presents Superquiz III [19]  Dembner Books               1987 
359  How Did We Find Out About Sunshine?       Walker                      1987 
360  Isaac Asimov Presents the Great SF
      Stories, 16: 1954 [13]                   DAW Books                   1987 
361  Young Witches and Warlocks [14]           Harper & Row                1987 
362  How to Enjoy Writing: A Book of Aid and
      Comfort [21]                             Walker                      1987 
363  Devils [14]                               NAL                         1987 
364  Norby Finds a Villain [21]                Walker                      1987 
365  Fantastic Voyage II: Destination Brain    Doubleday                   1987 
366  Hound Dunnit [28]                         Carroll & Graf              1987 
367  Space Shuttles [14]                       NAL                         1987 
368  How Did We Find Out About the Brain?      Walker                      1987 
369  Did Comets Kill the Dinosaurs?            Gareth Stevens, Inc         1987 
370  Beginnings: The Story of Origins -
      of Mankind, Life, the Earth, the
      Universe                                 Walker                      1987 
371  Atlantis [14]                             NAL                         1988 
372  Isaac Asimov Presents the Great SF
      Stories, 17: 1955 [13]                   DAW Books                   1988 
373  Asimov's Annotated Gilbert and Sullivan   Doubleday                   1988 
374  Isaac Asimov Presents From Harding to
      Hiroshima [34]                           Dembner Books               1988 
375  How Did We Find Out About
      Superconductivity?                       Walker                      1988 
376  Other Worlds of Isaac Asimov              Avenel                      1987 
377  Isaac Asimov's Book of Science and Nature
      Quotations [29]                          Blue Cliff                  1988 
378  The Relativity of Wrong                   Doubleday                   1988 
379  Prelude to Foundation                     Doubleday                   1988 
380  Encounters [14]                           Headline                    1988 
381  The Asteroids                             Gareth Stevens, Inc         1988 
382  The Earth's Moon                          Gareth Stevens, Inc         1988 
383  Mars: Our Mysterious Neighbor             Gareth Stevens, Inc         1988 
384  Our Milky Way and Other Galaxies          Gareth Stevens, Inc         1988 
385  Quasars, Pulsars, and Black Holes         Gareth Stevens, Inc         1988 
386  Rockets, Probes, and Satellites           Gareth Stevens, Inc         1988 
387  Our Solar System                          Gareth Stevens, Inc         1988 
388  The Sun                                   Gareth Stevens, Inc         1988 
389  Uranus: The Sideways Planet               Gareth Stevens, Inc         1988 
390* History of Biology (a chart)              Carolina Biological Suppls. 1988 
391  Isaac Asimov Presents the Best Crime
      Stories of the 19th Century [16]         Dembner Books               1988 
392  The Mammoth Book of Classic Science
      Fiction: Short Novels of the 1930's      Carroll & Graf              1988 
393  Monsters [16]                             NAL                         1988 
394  Isaac Asimov Presents the Great SF
      Stories, 18: 1956 [13]                   DAW Books                   1988 
395  Azazel                                    Doubleday                   1988 
396* Isaac Asimov's Science Fiction and
      Fantasy Story-a-Month 1989 Calendar      Pomegranate Calendars & Bks 1988 
397  Ghosts [14]                               NAL                         1988 
398  Saturn: The Ringed Beauty                 Gareth Stevens, Inc         1988 
399  How Was the Universe Born?                Gareth Stevens, Inc         1988 
400  Earth: Our Home Base                      Gareth Stevens, Inc         1988 
401  Ancient Astronomy                         Gareth Stevens, Inc         1988 
402  Unidentified Flying Objects               Gareth Stevens, Inc         1988 
403  Space Spotter's Guide                     Gareth Stevens, Inc         1988 
404  Norby Down to Earth                       Walker                      1988 
405  The Sport of Crime [17]                   Lynx                        1988 
406  How Did We Find Out About Microwaves?     Walker                      1989 
407  Isaac Asimov Presents the Great SF
      Stories, 19: 1957 [13]                   DAW Books                   1989 
408  Asimov's Galaxy: Reflections on Science
      Fiction                                  Doubleday                   1989 
409  All the Troubles of the World             Creative Education          1989 
410  Franchise                                 Creative Education          1989 
411  Robbie                                    Creative Education          1989 
412  Sally                                     Creative Education          1989 
413* Isaac Asimov Presents Tales of the
      Occult [14]                              Prometheus Books            1989 
414  Purr-fect Crime [17]                      Lynx                        1989 
415  Is There Life On Other Planets?           Gareth Stevens, Inc         1989 
416  Science Fiction, Science Fact             Gareth Stevens, Inc         1989 
417  Mercury: The Quick Planet                 Gareth Stevens, Inc         1989 
418  Space Garbage                             Gareth Stevens, Inc         1989 
419  Jupiter: The Spotted Giant                Gareth Stevens, Inc         1989 
420  The Birth and Death of Stars              Gareth Stevens, Inc         1989 
421  The Asimov Chronicles: Fifty Years of
      Isaac Asimov                             Dark Harvest                1989 
422  Robots [14]                               NAL                         1989 
423  History of Mathematics (a chart)          Carolina Biological Suppls. 1989 
424  Think About Space: Where Have We Been
      and Where Are We Going? [30]             Walker                      1989 
425  Isaac Asimov Presents Superquiz IV        Dembner Books               1989 
426  The Tyrannosaurus Prescription: and One 
      Hundred Other Science Essays             Prometheus Books            1989 
427  Asimov On Science: A 30 Year 
      Retrospective 1959-1989                  Doubleday                   1989 
428  Visions of Fantasy: Tales From the
      Masters [13]                             Doubleday                   1989 
429  Nemesis                                   Doubleday                   1989 
430  Curses [14]                               NAL                         1989 
431  Asimov's Chronology of Science and
      Discovery                                Harper & Row                1989 
432  How Did We Find Out About
      Photosynthesis?                          Walker                      1989 
433  The Complete Science Fair Handbook [31]   Scott Foresman & Co         1989 
434  Little Treasury of Dinosaurs (5 book set) Outlet                      1989 
       Giant Dinosaurs (vol. 1)
       Armored Dinosaurs (vol. 2)
       Small Dinosaurs (vol. 3)
       Sea Reptiles and Flying Reptiles
       (vol. 4)
       Meat-Eating Dinosaurs and Horned
       Dinosaurs (vol. 5)
435  The New Hugo Winners [13]                 Wynwood Press               1989 
436  Senior Sleuths: A Large Print Anthology
      of Mysteries and Puzzlers [28]           G. K. Hall & Co.            1989 
437  Norby and Yobo's Great Adventure [21]     Walker                      1989 
438  Mythology and the Universe                Gareth Stevens, Inc         1989 
439  Colonizing the Planets and the Stars      Gareth Stevens, Inc         1989 
440  Astronomy Today                           Gareth Stevens, Inc         1989 
441  Pluto: A Double Planet?                   Gareth Stevens, Inc         1989 
442  Piloted Space Flights                     Gareth Stevens, Inc         1989 
443  Comets and Meteors                        Gareth Stevens, Inc         1989 
444  Puzzles of the Black Widowers             Doubleday                   1990 
445  Norby and the Oldest Dragon [21]          Walker                      1990 
446  Cosmic Critiques: How & Why Ten Science
      Fiction Stories Work [13]                Writer's Digest Books       1990 
447  Frontiers: new discoveries about man and
      his planet, outer space and the
      universe                                 E. P. Dutton/Truman         1990 
448  Isaac Asimov Presents the Great SF
      Stories, 20: 1958 [13]                   DAW Books                   1990 
449  Out of the Everywhere                     Doubleday                   1990 
450  Robot Visions                             Byron Preiss                1990 
451  How Did We Find Out About Lasers?         Walker                      1990 
452  Neptune: The Farthest Giant               Gareth Stevens, Inc         1990 
453  Venus: A Shrouded Mystery                 Gareth Stevens, Inc         1990 
454  The World's Space Programs                Gareth Stevens, Inc         1990 
455  Isaac Asimov Presents the Great SF
      Stories, 21: 1959 [13]                   DAW Books                   1990 
456  Nightfall [32]                            Doubleday                   1990 
457  Robots from Asimov's                      Davis Publications          1990 
458  Invasions [14]                            Roc/Penguin Books           1990 
459  The Mammoth Book of Vintage Science
      Fiction: Short Novels of the 1950s [16]  Carroll & Graf              1990 
460  The Complete Stories Volume 1             Doubleday                   1990 
461  The March of the Millennia: A Key To
      Looking At History [30]                  Walker                      1991 
462  The Secret of the Universe                Doubleday                   1991 
463  How Did We Find Out About Neptune?        Walker                      1990 
464  How Did We Find Out About Pluto?          Walker                      1991 
465  Isaac Asimov Presents the Great SF
      Stories, 22: 1960 [13]                   DAW Books                   1991 
466  Atom: Journey Across the Subatomic
      Cosmos                                   E. P. Dutton/Truman         1991 
467  Cal                                       [35]                        1991 
468  Isaac Asimov Presents the Great SF
      Stories, 23: 1961 [13]                   DAW Books                   1991 

     The Mammoth Book of Golden Age Science
      Fiction: Short Novels of the
      1940's [16]                              Carroll & Graf              1989 
     Christopher Columbus: Navigator to the
      New World                                Gareth Stevens, Inc         1991 
     What is a Shooting Star?                  Gareth Stevens, Inc         1991 
     Why Do Stars Twinkle?                     Gareth Stevens, Inc         1991 
     Why Does the Moon Change Shape?           Gareth Stevens, Inc         1991 
     Why Do We Have Different Seasons?         Gareth Stevens, Inc         1991 
     What is an Eclipse?                       Gareth Stevens, Inc         1991 
     Faeries [14]                              Roc/Penguin Books           1991 
     The Mammoth Book of New World Science
      Fiction: Short Novels of the 1960s [16]  Carroll & Graf              1991 
     Isaac Asimov's Guide to Earth and Space   Random House                1991 
     Ferdinand Magellan: Opening the Door to
      World Exploration                        Gareth Stevens, Inc         1991 
     Norby and the Court Jester [21]           Walker                      1991 
     Our Angry Earth: a Ticking Time Bomb [33] Tor                         1991 
     Meriwether Lewis and William Clark        Gareth Stevens, Inc         1991 
     Why Are Whales Vanishing?                 Gareth Stevens, Inc         1991 
     Is Our Planet Warming Up?                 Gareth Stevens, Inc         1991 
     Why Is the Air Dirty?                     Gareth Stevens, Inc         1991 
     Where Does Garbage Go?                    Gareth Stevens, Inc         1991 
     What Causes Acid Rain?                    Gareth Stevens, Inc         1991 
     Asimov's Chronology of the World          HarperCollins               1991 
     Asimov Laughs Again: More Than 700
      Favorite Jokes, Limericks, and
      Anecdotes                                HarperCollins               1992 
     Isaac Asimov Presents the Great SF
      Stories, 24: 1962 [13]                   DAW Books                   1992 
     The New Hugo Winners, Volume II [6]       Baen Books                  1992 
     The Complete Stories Volume 2             Doubleday                   1992 
     Why Are Some Beaches Oily?                Gareth Stevens, Inc         1992 
     Why Are Animals Endangered?               Gareth Stevens, Inc         1992 
     What's Happening to the Ozone Layer?      Gareth Stevens, Inc         1993 
     Why Are the Rain Forests Vanishing?       Gareth Stevens, Inc         1992 
     Why Does Litter Cause Problems?           Gareth Stevens, Inc         1992 
     Isaac Asimov Presents the Great SF
      Stories, 25: 1963 [13]                   DAW Books                   1992 
     The Mammoth Book of Fantastic Science
      Fiction: Short Novels of the 1970s [16]  Carroll & Graf              1992 
     The Ugly Little Boy
      (Child of Time [U.K.]) [32]              Doubleday                   1992 
     Forward the Foundation                    Doubleday                   1993 
     The Positronic Man [32]                   Doubleday                   1993 
     The Mammoth Book of Modern Science
      Fiction: Short Novels of the 1980s [16]  Carroll & Graf              1993 
     Frontiers II: more recent discoveries
      about life, Earth, space, and the
      universe [21]                            E. P. Dutton/Truman         1993 
     The Future in Space                       Gareth Stevens, Inc         1993 
     I. Asimov: A memoir                       Doubleday                   1994 
     Gold                                      HarperPrism                 1995 
     Yours, Isaac Asimov [36]                  Doubleday                   1995
     Magic                                     HarperPrism                 1996 

Notes:

[1] Published by Doubleday since 1961

[2] Originally published under the pseudonym Paul French

[3] With Burnham S. Walker and William C. Boyd

[4] With William C. Boyd

[5] With Burnham S. Walker and M. K. Nicholas

[6] Anthology edited by Isaac Asimov

[7] Anthology edited by Isaac Asimov and Groff Conklin

[8] Natural History Press is a division of Doubleday

[9] With Stephen H. Dole

[10] With Theodosius Dobzhansky

[11] Anthology edited by Isaac Asimov, Martin H. Greenberg, & Joseph D. Olander

[12] With John Ciardi

[13] Anthology edited by Isaac Asimov & Martin H. Greenberg

[14] Anthology edited by Isaac Asimov, Martin H. Greenberg, & Charles G. Waugh

[15] Anthology edited by Isaac Asimov & Alice Laurance

[16] Anthology edited by Isaac Asimov, Charles G. Waugh, & Martin H. Greenberg

[17] Anthology edited by Isaac Asimov, Carol-Lynn Rossel Waugh, & Martin H. Greenberg

[18] Anthology edited by Isaac Asimov & J. O. Jeppson

[19] By Ken Fisher

[20] Anthology edited by Isaac Asimov, George R. R. Martin, & Martin H. Greenberg

[21] With Janet Asimov

[22] Anthology edited by Isaac Asimov, George Zebrowski, & Martin H. Greenberg

[23] Anthology edited by Isaac Asimov, Patricia S. Warrick, & Martin H. Greenberg

[24] Anthology edited by Isaac Asimov, Terry Carr, & Martin H. Greenberg

[25] Anthology edited by Isaac Asimov, Martin H. Greenberg, & David C. Yeager

[26] Edited by Isaac Asimov

[27] With Karen Frenkel

[28] Anthology edited by Isaac Asimov, Martin H. Greenberg, & Carol-Lynn Rossel Waugh

[29] With Jason A. Shulman

[30] With Frank White

[31] With Anthony D. Fredericks

[32] With Robert Silverberg

[33] With Frederik Pohl

[34] By Barrington Boardman

[35] A short story written exclusively for subscribers to the Isaac Asimov Collection – Not for sale in bookstores

[36] Edited by Stan Asimov


Author:

Edward Seiler
ejseiler@earthlink.net


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/mROETQXtsXc/asimov_titles.html


Android N-ify Xposed Module Will Bring Android N Features to Older Phones

Android (root, 5.0+): Unfortunately, the Android N preview is only available for a few devices. However, you can try out a few of the newest features on Android Lollipop and up with this Xposed module.




Original URL: http://feeds.gawker.com/~r/lifehacker/full/~3/RSv8feQOTNY/android-n-ify-xposed-module-will-bring-android-n-featur-1767470292


GitHub forces Gitbucket to change their UI

We received an email from GitHub a few weeks ago.

It said that GitHub does not allow cloning the GitHub user interface or copying their proprietary materials, that doing so infringes their exclusive intellectual property rights, and that it causes confusion for GitHub users. They requested that we eliminate GitHub's proprietary materials and any similarity to GitHub from GitBucket.

To be clear, we have never copied any materials from GitHub, so that is not a problem. However, we do need to reduce the similarity to GitHub, so as a quick first step we proposed changing the color scheme to make the difference obvious.

Fortunately, GitHub's response was amicable. They said that was good as a first step, but similarity is not only a matter of color: it also comes from the icons, the fonts, the layout, and so on. We will need to remove these similarities and eventually refresh our UI.

As a result, the next version, 3.13, will ship with the Bootstrap default theme and some parts of the UI simplified, and future versions will continue to diverge from GitHub's look.

GitBucket 3.13 with Bootstrap default theme

That said, we want to continue using GitHub for our project because we love it. GitHub has revolutionized the open-source world, and we are grateful for this fantastic software development platform.


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/Qe30HL9oflQ/change-user-interface.html


Calibre breakthrough: Convert e-books to the new Kindle KFX format with enhanced Typesetting

Want to be able to read sideloaded books in the fancy KFX format with your recent-model Kindle?

Craving dropped caps, hyphenation, better kerning?

Now, even if a book didn’t come from the Amazon store, you can enjoy the above features thanks to a new KFX Conversion Output plug-in for Calibre. Here’s the lowdown from MobileRead:

“The plugin is activated by selecting KFX as the output format when converting books in calibre. It performs the following steps during conversion:

  • Convert from the original e-book format to EPUB.
  • Use the Amazon Kindle Previewer to convert from EPUB to KDF.
  • Repackage the KDF data into a KFX container.

“Because the first step in conversion is to produce an EPUB and since the Kindle Previewer has no configuration options, the Conversion Output Options tab for KFX output is the same as for EPUB output.”
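In practice, those three steps are driven by a single conversion command. A minimal sketch, assuming calibre's `ebook-convert` command-line tool is on the PATH along with the KFX Output plugin and the Kindle Previewer (the file names here are hypothetical):

```python
import subprocess
from pathlib import Path


def kfx_command(src: Path, dest: Path) -> list:
    """Build the ebook-convert invocation for a KFX conversion.

    ebook-convert picks the output format from the destination file's
    extension, so a .kfx target routes the job through the KFX Output
    plugin (source -> EPUB -> Kindle Previewer KDF -> KFX container,
    as described in the quoted steps above).
    """
    return ["ebook-convert", str(src), str(dest)]


def convert_to_kfx(src: Path, dest: Path) -> None:
    # check=True raises CalledProcessError if any step (e.g. the
    # Previewer's EPUB-to-KDF conversion) fails.
    subprocess.run(kfx_command(src, dest), check=True)


# Usage (requires calibre, the KFX Output plugin, and Kindle Previewer):
#   convert_to_kfx(Path("book.epub"), Path("book.kfx"))
```

The same conversion can of course be run from calibre's GUI by selecting KFX as the output format, as the quoted posting describes.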

Please drop by the MobileRead posting from jhowell for more information, relevant links and caveats. Alas, the plugin isn’t for the faint-hearted:

“Unfortunately, the Amazon Kindle Previewer often fails to convert books and provides no guidance on how to correct the problem when this occurs. Getting a book to convert successfully may require trial and error editing of the source format. If a conversion error occurs the plugin attempts to capture the most relevant error message from temporary log files produced by the Previewer. The error messages produced are cryptic, but better than nothing. View the conversion job log after an error occurs to see the messages produced by the Previewer during conversion.”

And still more: “This plugin has only been tested on the Windows platform. Compatibility with Mac OS has not been tested. It is unknown whether or not the Previewer will function under Linux/Wine.

“This plugin only converts from other e-book formats to KFX. It does not convert from KFX to other formats.”

Oh, the fun of the Tower of eBabel! High time for Amazon to abandon its proprietary formats (and really, really lean on publishers to drop encryption-based DRM, or at least replace it with the more benign social DRM). Yes, the KFX plug-in is a breakthrough. But not for everyone. I love Calibre, but must one really have to use it to keep up fully with the latest format changes at Amazon?

If Amazon wants to add formatting capabilities for typical e-books, then it should work within the International Digital Publishing Forum. Amazon can compete very well, thank you, in a number of areas ranging from price to selection, and the last thing we need is for the company to keep inflicting more complexities on users. Bring on the criticism. I’m sticking to my guns. Amazon, as much as ever, needs to do ePub.

(Via The eBookReader.com and Nate.)

The post Calibre breakthrough: Convert e-books to the new Kindle KFX format with enhanced Typesetting appeared first on TeleRead News: E-books, publishing, tech and beyond.


Original URL: http://www.teleread.com/calibre-breakthrough-convert-e-books-new-kindle-kfx-format-enhanced-typesetting/


On pastrami and the business of PLOS

Last week my friend Andy Kern (a population geneticist at Rutgers) went on a bit of a bender on Twitter prompted by his discovery of PLOS’s IRS Form 990 – the annual required financial filing of non-profit corporations in the United States. You can read his string of tweets and my responses, but the gist of his critique is this: PLOS pays its executives too much, and has an obscene amount of money in the bank.

Let me start by saying that I understand where his disdain comes from. Back when we were starting PLOS we began digging into the finances of the scientific societies that were fighting open access, and I was shocked to see how much money they were sitting on and how much their CEOs get paid. If I weren’t involved with PLOS, and I’d stumbled upon PLOS’s Form 990 now, I’d have probably raised a storm about it. I have absolutely no complaints about Andy’s efforts to understand what he was seeing – non-profits are required to release this kind of financial information precisely so that people can scrutinize what they are doing. And I understand why Andy and others find some of the info discomforting, and share some of his concerns. But having spent the last 15 years trying to build PLOS and turn it into a stable enterprise, I have a different perspective, and I’d like to explain it.

Let me start with something on which I agree with Andy completely: science publishing is way too expensive. Andy says he originally started poking into PLOS’s finances because he wanted to know where the $2,250 he was asked to pay to publish in PLOS Genetics went, as this seemed like a lot of money to take a paper, have a volunteer academic serve as editor, find several additional volunteers to serve as peer reviewers, and then, if they accept the paper, turn it into PDF and HTML versions and publish it online. And he’s right. It is too much money.

That $2,250 is only about a third of the $6,000 a typical subscription journal takes in for every paper they publish, and that $6,000 buys access for only a tiny fraction of the world’s population, while the $2,250 buys it for everyone. But $2,250 is still too much, as is the $1,495 at PLOS ONE. I’ve always said that our goal should be to make it cost as little as possible to publish, and that our starting point should be $0 a paper.

The reality is, however, that it costs PLOS a lot more than $0 to handle a paper. We handle a lot of papers – close to 200 a day – each one different.  There’s a lot of manual labor involved in making sure the submission is complete, that it passes ethical and technical checks, in finding an editor and reviewers and getting them to handle the paper in a timely and effective manner. It then costs money to turn the collection of text and figures and tables into a paper, and to publish it and maintain a series of high-volume websites. All together we have a staff of well over 100 people running our journal operations, and they need to have office space, people to manage them, an HR system, an accounting system and so on – all the things a business has to have. And for better or worse our office is in San Francisco (remember that two of the three founders were in the Bay Area, and we couldn’t have started it anywhere else), which is a very expensive place to operate. We have always aimed to keep our article processing charges (APCs) as low as possible – it pains me every time we’ve had to raise our charges, since I think we should be working to eliminate APCs, not increase them. But we have to be realistic about what publishing costs us.

The difference in price between our journals reflects different costs. PLOS Biology and PLOS Medicine have professional editors handling each manuscript, so they’re intrinsically more expensive to operate. They also have relatively low acceptance rates, meaning a lot of staff time is spent on rejected papers, which generate no revenue. This is also the reason for the difference in price between our community journals like PLOS Genetics and PLOS ONE: the community journals reject more papers and thus we have to charge more per accepted paper. It might seem absurd to have people pay to reject other people’s papers, but if you think about it, that’s exactly what makes selective journals attractive – they have to publish your paper and reject lots of others. I’ve argued for a long time that we should do away with selective journals, but so long as people want to publish in them, they’re going to have this weird economics. And note this is not just true of open access journals – higher impact subscription journals bring in a lot more money per published paper than low impact subscription journals, for essentially the same reason.
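The arithmetic behind that last point is simple. As a back-of-the-envelope sketch (the per-submission cost and acceptance rates below are illustrative assumptions, not PLOS's actual figures): if every submission costs roughly the same to handle but only accepted papers pay an APC, the break-even APC is the per-submission cost divided by the acceptance rate.

```python
def break_even_apc(cost_per_submission: float, acceptance_rate: float) -> float:
    """APC each accepted paper must cover when rejected papers
    consume staff time but generate no revenue."""
    return cost_per_submission / acceptance_rate

# Illustrative numbers only: the same $500 handling cost per submission
# implies very different prices at different selectivity levels.
megajournal_apc = break_even_apc(500, 0.7)  # high acceptance rate
selective_apc = break_even_apc(500, 0.2)    # low acceptance rate
```

Under these made-up figures the selective journal must charge several times more per accepted paper than the megajournal, even though handling each individual submission costs the same, which is the dynamic described above.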

Could PLOS do all these things more efficiently, more effectively and for less money? Absolutely. We, like most other big publishers, are using legacy software and systems to handle submissions, manage peer review and convert manuscripts into published papers. These systems are, for the most part, expensive, outdated and difficult or expensive (usually both) to customize. We are in a challenging situation since, until very recently, we weren’t in a position to develop our own systems for doing all these things, and we couldn’t just switch to cheaper or free systems since they weren’t built to handle the volume of papers we deal with.

That said, it’s certainly possible to run journals much, much more cheaply. It costs the physics pre-print arXiv something like $10 a paper to maintain its software, screening and website. There are times when I wish PLOS had just hacked together a bunch of Perl scripts and hung out a shingle and built in new features as we needed them. But part of what made PLOS appealing at the start is that it didn’t work that way – for better or worse it looked like a real journal, and this was one of the things that made people comfortable with our (at the time) weird economic model. I’m not sure this is true anymore, and if I were starting PLOS today I would do things differently, and think I could do things much less expensively. I would love it if people would set up inexpensive or even free open access biology journals – it’s certainly possible with open source software and fully volunteer labor – and for people to get comfortable with biomedical publishing basically being no different than just posting work on the Internet, with lightweight systems for peer review. That has always seemed to me to be the right way to do things. But PLOS can’t just pull the plug on all the things we do, so we’re trying to achieve the same goal by investing in developing software that will make it possible to do all of the things PLOS does faster, better and cheaper. We’re going to start rolling it out this year, and, while I don’t run PLOS and can’t speak for the whole board, I am confident that this will bring our costs down significantly and that we will ultimately be in a position to reduce prices.

Which brings us to issue number two. Andy and a lot of other people took umbrage at the fact that PLOS has margins of 20% and has ~$25 million in assets. Again, I understand why people look at these numbers and find them shocking – anything involving millions of dollars always seems like a lot of money. But this is a misconception. Both of these numbers represent nothing more than what is required for PLOS to be a stable enterprise.

I’ll start by reminding people that PLOS is still a relatively young company, working in a rapidly changing industry. Like most startups, it took a long time for PLOS to break even. For the first nine years of our existence we lost money every year, and were able to build our business only because we got strong support from foundations that believed in what we were doing. Finally, in 2011, we reached the point where we were taking in slightly more money than we were spending, allowing us to wean ourselves off foundation support. But we still had essentially no money in the bank, and that’s not a good thing. Good operating practices for any business dictate that the company have money in the bank to cover a downturn in revenue. This is particularly the case with open access publishers, since we have no guaranteed revenue stream – in contrast to subscription publishers, who make long-term subscription deals. What’s more, this industry is changing rapidly, with the number of papers going to open access journals growing and many new open access publishers entering the market. So it’s very hard for us to predict what our business is going to look like from year to year, while a lot of our expenses, like rent, software licenses and salaries, have to be paid before the revenue they enable comes in. The only way to survive in this market is to have a decent amount of money in the bank to buffer against the unpredictable. If anything, I am told by people who spend their lives thinking about these things, we’re cutting things a little close. So, while 20% margins may seem like a lot, given our overall financial situation and the fact that we’ve been profitable for only five years, I think it’s actually a reasonable compromise between keeping costs as low as we can and ensuring that PLOS remains financially stable while also allowing us to make modest investments in technology that will make publishing better and cheaper in the long run.

Just to put these numbers in perspective for people who (like me) aren’t trained to think about these things, I had a look at the finances of a large set of scientific societies. I looked primarily at the members of FASEB, a federation of most of the major societies in molecular biology. Many of them have larger operating margins, and far larger cash reserves than PLOS. And I haven’t found one yet that doesn’t have a larger ratio of assets to expenses than PLOS does. And these are all organizations that have far more stable revenue streams than PLOS does. So I just don’t think it’s fair to suggest that either PLOS’s margins or reserves are untoward.

Indeed these numbers represent something important – that PLOS has become a successful business. I’ll once again remind people that one of the major knocks against open access when PLOS started was that we were a bunch of naive idealists (that’s the nicest way people put it) who didn’t understand what it took to run a successful business. Commercial publishers and societies alike argued repeatedly to scientists, funders and legislators that the only way to make money in science publishing was to use a subscription model. So it was absolutely critical to the success of the open access movement that PLOS not only succeed as a publisher, but that we also succeed as a business – to show the commercial and society publishers that their principal argument for why they refused to shift to open access was wrong. Having been the recipient of withering criticism – both personally and as an organization – about being too financially naive, it’s ironic and a bit mind-boggling to all of a sudden be criticized for having created too good of a business.

Now despite that, I don’t want people to confuse my defense of PLOS’s business success with a defense of the business it’s engaged in. While I believe the APC/service business model PLOS has helped to develop is far, far superior to the traditional subscription model, because it does not require paywalls, I’ve never been comfortable with the APC business model in an absolute sense (and I recognize the irony of my saying that) because I wish science publishing weren’t a business at all. When we started PLOS the only way we had to make money was through APCs, but if I had my druthers we’d all just post papers online in a centralized server funded and run by a coalition of governments and funders, and scientists would use lightweight software to peer review published papers and organize the literature in useful ways. And no money would be exchanged in the process. I’m glad that PLOS is stable and has shown the world that the APC model can work, but I hope that we can soon move beyond it to a very different system.

Now I want to end on the issue that seemed to upset people the most – which is the salaries of PLOS’s executives. I am immensely proud of the executive team at PLOS – they are talented and dedicated. They make competitive salaries – and we’d have trouble hiring and retaining them if they didn’t. The board has been doing what we felt we had to do to build a successful company in the marketplace we live in – after all, we were founded to fix science publishing, not capitalism. But as an individual I can’t help but feel that’s a copout. The truth is the general criticism is right. A system where executives make so much more money than the staff they supervise isn’t just unfair, it’s ultimately corrosive. It’s something we all have to work to change, and I wish I’d done more to help make PLOS a model of this.

Finally, I want to acknowledge a tension evident in a lot of the discussion around this issue. Some of the criticism of PLOS – especially about margins and cash flow – has been just generally unfair. But other criticism – about salaries and transparency – reflects something important. I think people understand that in these ways PLOS is just being a typical company. But we weren’t founded to just be a typical company – we were founded to be different and, yes, better, and people have higher expectations of us than they do of a typical company. I want it to be that way. But PLOS was also not founded to fail – that would have been terrible for the push for openness in science publishing. I am immensely proud of PLOS’s success as a publisher, agent for change, and a business – and of all the people inside and outside of the organization who helped achieve it. Throughout PLOS’s history there were times we had to choose between abstract ideals and the reality of making PLOS a successful business, and I think, overall, we’ve done a good, but far from perfect, job of balancing this tension. And moving forward I personally pledge to do a better job of figuring out how to be successful while fully living up to those ideals.



New Kid on the Blockchain

All of this work is still very early. The first full public version of the Ethereum software was recently released, and the system could face some of the same technical and legal problems that have tarnished Bitcoin.

Many Bitcoin advocates say Ethereum will face more security problems than Bitcoin because of the greater complexity of the software. Thus far, Ethereum has faced much less testing, and many fewer attacks, than Bitcoin. The novel design of Ethereum may also invite intense scrutiny by authorities given that potentially fraudulent contracts, like the Ponzi schemes, can be written directly into the Ethereum system.

But the sophisticated capabilities of the system have made it fascinating to some executives in corporate America. IBM said last year that it was experimenting with Ethereum as a way to control real world objects in the so-called Internet of things.

[Photo: ConsenSys employees in Brooklyn. The company has hired more than 50 developers. Credit: Cole Wilson for The New York Times]

Microsoft has been working on several projects that make it easier to use Ethereum on its computing cloud, Azure.

“Ethereum is a general platform where you can solve problems in many industries using a fairly elegant solution — the most elegant solution we have seen to date,” said Marley Gray, a director of business development and strategy at Microsoft.

Mr. Gray is responsible for Microsoft’s work with blockchains, the database concept that Bitcoin introduced. Blockchains are designed to store transactions and data without requiring any central authority or repository.

Blockchain ledgers are generally maintained and updated by networks of computers working together — somewhat similar to the way that Wikipedia is updated and maintained by all its users.
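The core mechanism behind such a ledger can be illustrated with a toy sketch. This is simplified Python for intuition only (neither Bitcoin’s nor Ethereum’s actual implementation, and omitting mining and the peer-to-peer network entirely): each block commits to a hash of the previous block, so altering any historical record invalidates every block that follows.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})
    return chain

def verify(chain):
    """Tampering with an earlier block breaks every later link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))   # True
chain[0]["transactions"][0]["amount"] = 500  # rewrite history
print(verify(chain))   # False: the hash chain no longer matches
```

Because every participant can recompute the hashes independently, the network as a whole can reject a tampered copy without any central authority adjudicating.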

Many corporations, though, have created their own Ethereum networks with private blockchains, independent of the public system, and that could ultimately detract from the value of the individual unit in the Ethereum system — known as an Ether — that people have recently been buying.

The interest in Ethereum is one sign of the corporate fascination with blockchains. Most major banks have expressed an interest in using them to make trading and money transfer faster and more efficient. On Tuesday, executives from the largest banks will gather for a conference, “Blockchain: Tapping Into the Real Potential, Cutting Through the Hype.”

Many of these banks have recently been looking at how some version of Ethereum might be put to use. JPMorgan, for instance, has created a specific tool, Masala, that allows some of its internal databases to interact with an Ethereum blockchain.

Michael Novogratz, a former top executive at the private equity firm Fortress Investment Group, who helped lead Fortress’s investment in Bitcoin, has been looking at Ethereum since he left Fortress last fall. Mr. Novogratz said that he made a “significant” purchase of Ether in January. He has also heard how the financial industry’s chatter about the virtual currency has evolved.

“A lot of the more established players were thinking, ‘It’s still an experiment,’ ” he said. “It feels like in the last two to three months that experiment is at least getting a lot more validation.”

Since the beginning of the year, the value of an individual unit of Ether has soared as high as $12 from around $1. That has brought the value of all existing Ether to over $1 billion at times, significantly more than any virtual currency other than Bitcoin, which had over $6 billion in value outstanding last week.

Since Bitcoin was invented, there have been many so-called alt-coins that have tried to improve on Bitcoin, but none have won the following of Ethereum.

[Photo: Ethereum, a rival currency to Bitcoin, has soared in value recently. Credit: Cole Wilson for The New York Times]

Unlike Bitcoin, which was released in 2009 by a mysterious creator known as Satoshi Nakamoto, Ethereum was created in a more transparent fashion by a 21-year-old Russian-Canadian, Vitalik Buterin, after he dropped out of the University of Waterloo in Ontario.

The most basic aim of Ethereum was to make it possible to program binding agreements into the blockchain — the smart contract concept. Two people, for instance, could program a bet on a sports game directly into the Ethereum blockchain. Once the final score came in from a mutually agreed upon source — say, The Associated Press — the money would be automatically transferred to the winning party. Ether can be used as a currency in this system, but Ether are also necessary to pay for the network power needed to process the bet.

The Ethereum system has sometimes been described as a single shared computer that is run by the network of users and on which resources are parceled out and paid for by Ether.

A team of seven co-founders helped Mr. Buterin write up the software after he released the initial description of the system. Mr. Buterin’s team raised $18 million in 2014 through a presale of Ether, which helped fund the Ethereum Foundation, which supports the software’s development.

Like Bitcoin, Ethereum has succeeded by attracting a dedicated network of followers who have helped support the software, partly in the hope that their Ether will increase in value if the system succeeds. Last week, there were 5,800 computers — or nodes — helping support the network around the world. The Bitcoin network had about 7,400 nodes.

One of Mr. Buterin’s co-founders, Joseph Lubin, has set up ConsenSys, a company based in Brooklyn that has hired over 50 developers to build applications on the Ethereum system, including one that enables music distribution and another that allows for a new kind of financial auditing.

The ConsenSys offices are in an old industrial building in the Bushwick section of Brooklyn. The office is essentially one large room, with all the messy trademarks of a start-up operation, including white boards on the walls and computer parts lying around.

Mr. Lubin said he had thrown himself into Ethereum after starting to think that it delivered on some of the failed promise of Bitcoin, especially when it came to allowing new kinds of online contracts and markets.

“Bitcoin presented the broad strokes vision, and Ethereum presented the crystallization of how to deliver that vision,” he said.

Joseph Bonneau, a computer science researcher at Stanford who studies so-called crypto-currencies, said Ethereum was the first system that had really caught his interest since Bitcoin.

It is far from a sure thing, he cautioned.

“Bitcoin is still probably the safest bet, but Ethereum is certainly No. 2, and some folks will say it is more likely to be around in 10 years,” Mr. Bonneau said. “It will depend if any real markets develop around it. If there is some actual application.”



