AngularJS project structure

I’ve been programming professionally for about 6-7 years now and I’ve used my fair share of languages/frameworks/stacks. My most recent job (as well as some of my current freelance work) has led me down the AngularJS/NodeJS path. Don’t get me wrong – I love using both of these in my web development work. NodeJS with Express is quite nice for writing basic APIs, and AngularJS on the front end lets someone who sucks at front-end work produce some pretty useful interfaces.

Now, obviously there are a million-and-one things that developers debate about. For nostalgia’s sake, let’s list a few:

  • Tabs vs. spaces
  • Vim vs. Emacs
  • Language X vs. language Y
  • Whether API X is ‘RESTful’

This list goes on and on.

I try not to get sucked into these sorts of discussions, as I believe many seasoned developers are pretty stubborn… although I’ll admit that I have engaged in pointless debates on more than one occasion. Admit it – you have, too!

One thing that I hardly ever hear about, though, is directory structure and file naming conventions for projects. In my experience, this has mostly been self-explanatory, or it has been documented on a per-project basis. Both of these are fine – there is something in place that makes it obvious where you should be placing your source files and how you should be naming them.

Let’s fast forward to when I started at my last place of employment. They weren’t really a dev shop, but they did produce software that was used internally. I was tasked with developing an internal tool with a web front end that offered a drag-and-drop style interface for generating PL/SQL queries. I didn’t really want to use something like PHP (despite my experience with it), and I had the opportunity to explore slightly more modern approaches, so I decided to run with AngularJS for the front end and a NodeJS API (with Express running on top) for the back end.

Obviously, when you’re just messing around, it doesn’t really matter where you shove files. But as soon as things start getting a little bit bigger, something in your brain triggers and you start bringing some method to the madness: your code becomes more organised, and so does your file and directory structure.

Good one, Dave, you’ve just described every project, ever.

Hold up just a second. There’s just one thing that is bugging me. Why is doing this so damn hard for AngularJS?

The one question repeating itself in my head…

Why isn’t this a solved problem?

I have no idea.

I’m not really here to force what I believe to be an acceptable layout for an AngularJS project, but rather to generate discussion on what may be better and on why things like this are so difficult. Is this becoming the same sort of conundrum as coming up with effective names for variables?

The goals I want for any project structure are:

  • There must be consistency in the structure
  • I must be able to find what I am looking for quickly
  • I must be able to find the code I am looking for based on file names and file location
  • It has to scale

The first two points do happen naturally in a lot of projects. It’s the third point that seems to be where things deteriorate. The file name and location should be a very strong indication as to what is contained in the file. This sounds so obvious, but far too often I see files with either non-descriptive names, or I can’t even find a file name that looks like it might contain what I want.

Let’s start off with something basic… Oh yeah, that is totally a hamburger.

[Image: basic]

So this one is pretty easy. We have all our JS in app.js, our AngularJS library in angular.js and our markup in index.html.

Enter problem one. As soon as the page gets any more complex than one main view and one controller, things start to get large and hard to read. We also have the problem of mixing directives/controllers/services/config etc. all in the same JS file, and of large amounts of associated markup winding up in index.html.
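
To make that concrete, here’s a rough sketch of what that single app.js tends to turn into (the module, controller, service and directive names here are made up for illustration):

  // app.js – the entire application lives in this one file
  angular.module('myApp', [])
    .config(function ($httpProvider) {
      // app-wide config mixed in with everything else
      $httpProvider.defaults.headers.common['X-Requested-With'] = 'XMLHttpRequest';
    })
    .service('fooService', function ($http) {
      // data access for the "foo" feature
      this.getFoos = function () { return $http.get('/api/foos'); };
    })
    .controller('FooController', function ($scope, fooService) {
      // view logic for the "foo" feature
      fooService.getFoos().then(function (response) { $scope.foos = response.data; });
    })
    .directive('fooWidget', function () {
      // presentation for the "foo" feature
      return { restrict: 'E', templateUrl: 'foo-widget.html' };
    });
  // ...and every new feature bolts another few hundred lines onto the end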

You get it – this is going to get large, quickly.

Alright, so let’s split things out a bit.

[Image: slightly less basic]

Okay, so now we’ve got all controllers in the controllers directory, all the services in the services directory etc…
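
For reference, that layout looks something like this (the file names are purely illustrative):

  index.html
  lib/
    angular.js
  js/
    app.js
    controllers/
      fooController.js
      barController.js
    services/
      fooService.js
    directives/
      fooDirective.js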

Enter problem two. What happens when my project has heaps of controllers/services/etc.? I now have directories that could contain a lot of files, none of which are particularly easy to find.

Surely we can do a little bit better than this. Let’s also make things slightly more real by adding in a bower config file and a stylesheet, and by assuming the user wants some URL routing.

[Image: getting there]

We’re starting to see some structure – finally.

We have assets moved away from the app, and the app files live in their own area of the project. app.js contains the module declarations and run/config code, and routes.js contains our URL routing config.
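
As a rough sketch of that split – assuming ngRoute here, though ui-router or anything else would look much the same:

  // app.js – module declaration plus any run/config blocks
  angular.module('myApp', ['ngRoute']);

  // routes.js – URL routing config lives here and nowhere else
  angular.module('myApp').config(function ($routeProvider) {
    $routeProvider
      .when('/foo', {
        templateUrl: 'views/foo.html',
        controller: 'FooController'
      })
      .otherwise({ redirectTo: '/foo' });
  });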

We still haven’t quite ticked off my third criterion that I mentioned before:

I must be able to find the code I am looking for based on file names and file location

Instead of just having directories containing all of the controllers/services/etc. for every component, how about we split things into their respective components?

We’ll also assume at this stage that the developer wants to have separate views for each controller. I’m calling them views here, but feel free to call them partials, templates, woozawozzles – whatever, as long as there is consistency.

[Image: structure]
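
Roughly speaking, the layout now looks like this (foo and bar are placeholder component names):

  index.html
  bower.json
  assets/
    css/
      app.css
    libs/
      angular.js
      angular-route.js
  app/
    app.js
    routes.js
    components/
      foo/
        foo.controller.js
        foo.service.js
        foo.html
      bar/
        bar.controller.js
        bar.html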

Hey – look at that! Things aren’t looking too bad now. Let’s revisit our criteria…

  • There is consistency with naming. This one is pretty self-explanatory.
  • We can find what we are looking for quickly. If I want to find the controller for the foo component, I know exactly where that is going to live.
  • I know that the controller logic for the foo component is going to be in the foo.controller.js file.
  • It doesn’t matter how many extra components you add to the application – you aren’t going to be overwhelmed at all.

Obviously arguments can (and probably will) be made against what I have just claimed, such as “what do we do with shared/common code?”, to which I could respond with “add a common directory in either app/ or components/” – but I think that it won’t necessarily lead to a healthy discussion.
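
For what it’s worth, shared code slots in without disturbing the rest of the layout – something like the sketch below, with purely hypothetical file names:

  app/
    common/
      directives/
        datepicker.directive.js
      services/
        api.service.js
    components/
      foo/
        ...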

You could also point out that what I’ve been suggesting looks very similar to Adnan Kukic’s article on this topic – and it is. I have read that article many times and I agree with a lot of it. However, despite Adnan presenting a very solid case, I simply don’t see his suggestions being used widely.

I really want the following:

  1. I want to know why there isn’t a widely accepted directory/file structure for AngularJS projects. I’m not sure “AngularJS is new and hasn’t had time to mature” is an answer I can swallow.
  2. How do we get to the stage where this is no longer a problem? I don’t really believe that every project is different enough to warrant an entirely new scheme.

Thank you for taking time to read my ramblings on this topic. For some reason it interests me more than it probably should.

I would really love to see some discussion of how people structure their web projects, and of why they think we aren’t yet at a stage where there are more widely accepted approaches to this problem.

-Dave


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/ypP3BFpzaD0/angular-project-structure

Original article

Popular Firefox Add-Ons Open Millions To New Attack

An anonymous reader writes: Security researchers claim that NoScript and other popular Firefox add-on extensions are exposing millions of end users to a new type of vulnerability which, if exploited, can allow an attacker to execute malicious code and steal sensitive data. The vulnerability resides in the way Firefox extensions interact with each other. From a report on SlashGear, “The problem is that these extensions do not run sandboxed and are able to actually access data or functions from other extensions that are also enabled. This could mean, for example, that a malware masquerading as an add-on can access the functionality of one add-on to get access to system files or the ability of another add-on to redirect users to a certain web page, usually a phishing scam page. In the eyes of Mozilla’s automated security checks, the devious add-on is blameless as it does nothing out of the ordinary.” Firefox’s VP of Product acknowledged the existence of the aforementioned vulnerability. “Because risks such as this one exist, we are evolving both our core product and our extensions platform to build in greater security. The new set of browser extension APIs that make up WebExtensions, which are available in Firefox today, are inherently more secure than traditional add-ons, and are not vulnerable to the particular attack outlined in the presentation at Black Hat Asia. As part of our electrolysis initiative — our project to introduce multi-process architecture to Firefox later this year — we will start to sandbox Firefox extensions so that they cannot share code.”




Original URL: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/GhqoQZiVf9U/popular-firefox-add-ons-open-millions-to-new-attack

Original article

How the Commodore Amiga Powered the Prevue Channel

The Commodore Amiga 1000. (Photo: Kaiiv/Pixel8/CC BY-SA 3.0)

A version of this post originally appeared on the Tedium newsletter.

In terms of planning our lives around what our TVs spit out, we’ve come a long way from the overly condensed pages of TV Guide.

In fact, the magazine was already looking awful obsolete in the 1980s and 1990s, when cable systems around the country began dedicating entire channels to listing TV schedules.

The set-top box, the power-sucking block that serves as the liaison between you and your cable company, is a common sight in homes around the country these days.

But before all that was the Commodore Amiga, a device that played a quiet but important role in the cable television revolution.

The Amiga was a much-loved machine, huge among a cult of users who embraced its impressive video and audio capabilities, which blew away every other platform at the time of its release.

As a multimedia powerhouse, it was ahead of both the Apple Macintosh and the IBM PC by nearly a decade at the time of its 1985 release, and its relatively inexpensive launch price of $1,295 made the computer a bit of a bargain. And seeing as “Amiga” is the Spanish word for a female friend, it was also friendlier than its office-drone competitors.

(Ars Technica has a long-running series about the Amiga. When you get a chance, dig in.)

For cable providers, the Amiga’s capabilities for displaying content on a television were a bit of a godsend. Previous offerings, such as the Atari 800, were able to put messages onto a television screen, though not with much in the way of pizzazz.

The Atari 800. (Photo: Matt Chan/CC BY-ND 2.0)

As a result, the Amiga quickly became the cable industry’s computer of choice in the pre-HDTV era, especially after the release of NewTek’s Video Toaster in 1990. Video Toaster, which at first was only compatible with the Amiga, made it possible to do complex video editing at a small fraction of the cost of specialized professional video-editing platforms, and that made it popular with public-access TV stations.

“We showed the Toaster recently at the National Association of Broadcasters show—this is where the engineers go to buy their stuff—and they came by the booth. They said things like, ‘This is unbelievable, this is revolutionary that you can do this in a box for this price,'” Newtek’s Paul Montgomery said in a 1990 interview with the PBS show Computer Chronicles.

Video Toaster was so successful that it upstaged the Amiga platform as a whole, particularly in the United States. While Commodore shut down in 1994, Newtek is still around.

If you subscribed to cable television in the ’90s, you most likely saw Video Toaster in action on the cable dial. But the most notable use of the Amiga in cable television didn’t actually rely on Video Toaster at all.

That was the Prevue Guide, which may not have gotten the attention of MTV, TBS, or Nickelodeon in those days, but served an important purpose: it was the channel you watched to see what was on those channels.

More than 40 million people had access to the Prevue Guide in 1995, and by telling you what was on the air, it played a utility role nearly as important as your remote control.

The channel, run by the United Video Satellite Group (UVSG), had its roots in the Electronic Program Guide (EPG), an ’80s-era channel programming platform that initially ran on a network of Atari 800 computers. (Here’s what it looked like in 1987.)

Later, it moved to the Amiga, and during its mid-’90s heyday, it represented the largest single way that most people were made aware of the Commodore Amiga in the United States—particularly as many cable providers ran their own local Amigas with channel listings.

Like Newtek and its Video Toaster, UVSG was more successful with the Prevue Channel than Commodore itself was with the Amiga.

When you stumbled upon a Prevue Guide channel, you always knew what you were getting—a constant scroll of the channels so consistent that you could set your clock to it.

In the ’90s, the channel was known for its split-screen setup: The top half of the screen was used for a variety of previews for premium channels, as well as infomercial-style advertising, especially for psychics. The bottom half, meanwhile, was like a spreadsheet of constant data.

The Amiga screen. (Photo: Les Chatfield/CC BY 2.0)

And we can’t talk about this service without mentioning the synth-heavy melodies that would run on this platform at all hours of the day. A YouTube user, PrevueChannelMusic, has gathered many of these Muzak-style tunes for your listening pleasure at any time. Wear headphones.

But ultimately, the technology was fallible, and it, too, is something cable fans remember the Prevue Guide for. The Amiga was known for its unusual error messages, particularly “Guru Meditation,” which was its equivalent of the Blue Screen of Death. Sometimes, the error would appear while the Prevue Guide was playing, which led to some interesting on-screen displays. And since this was TV, everyone in your neighborhood would see these displays.

This seems like the kind of thing that’s so mundane and obscure that you’d think nobody would ever talk about it online, but you’d be wrong. In fact, one of the biggest fans of the Prevue Guide is kind of famous.

Ari Weinstein was just five years old on the day the Prevue Channel switched formats to become the TV Guide Channel, the point at which many fans of the network think the channel lost the plot.

Despite that, the channel played a key role in inspiring his still-budding career as an app developer.

Weinstein is one of the creators of two popular apps, the iOS automation app Workflow and the Mac app DeskConnect. In January, Forbes added him to their 2016 30 Under 30 class. He’s just 21, and in a lot of ways, he’s just getting started.

Like many programmers, he caught the coding bug while he was still in school. Unlike many programmers, he spent a good chunk of his teen years reverse-engineering old Amigas that ran the Prevue Guide in the ’90s. In 2009, he stumbled upon a manual for operating an old Prevue machine, and that led Weinstein down the ultimate rabbit hole.

“I started researching what Prevue was, and it brought back these memories and I just became obsessed with this old TV channel,” Weinstein explained in an interview. “I loved the branding of it, for some reason, and the appearance—a lot of stuff about it felt very ’90s and brought back good memories. But I also became fascinated with how it worked.”

From there, he discovered the history behind the machines—UVSG’s role as a vendor to cable companies around the country, what the company did with those machines after they passed their sell-by date (destroyed them, mostly), and how he could get his hands on a few.

“I actually found some of these units by painstakingly following all mentions of it on the Internet and reaching out to people,” he noted.

Weinstein didn’t just reach out to people, either. He connected them, launching a still-active forum to bring together a surprisingly large community of Prevue Guide enthusiasts, as well as a Wiki to discuss and organize the community’s findings around the platform.

He says that the fairly unusual hobby helped him as he moved from working on really old platforms as a teen to really new ones as an adult.

Before becoming a Prevue Guide enthusiast, Weinstein built a name for himself in the iOS jailbreaking scene by creating iJailbreak. Between his time reverse-engineering iPhones and reverse-engineering old Amigas owned by cable companies, he’s become very adept at using such techniques to figure out the ins and outs of the Mac and iOS operating systems, which DeskConnect and Workflow bend in interesting ways.

“I actually learned a lot on the technical side from reverse engineering the Prevue software, which has helped me a lot in other stuff I’ve worked on since,” Weinstein explained.

A screengrab of Ari Weinstein’s Prevue Guide Wiki. (Photo: Screengrab)

These days, Weinstein and other enthusiasts are doing more to keep the memory of the Prevue Guide alive than the companies that currently own the channel, CBS and Lions Gate Entertainment.

Like much of the cable industry, the channel has gone through a variety of shifts over the years. The biggest one came in 1999, when United Video Satellite Group purchased TV Guide from News Corp.

The purchase, which came three years after a failed partnership between the two parties, helped spell a key turning point for the Prevue brand and its underlying technology, which helped drive listings and pay-per-view systems around the country.

Within months, the Prevue Channel was rebranded to the TV Guide Channel, and soon after the merger, the company moved away from the Amiga platform for good, switching to Windows and an arguably-uglier format.

It wasn’t long before cable boxes made the listings obsolete, and by 2008, ownership of the channel and the magazine was separated once again, with the company Rovi currently in possession of the patents and technology that drove the scrolling TV listings.

By 2014, the TV Guide Channel changed its name to Pop, reflecting a final move away from listings.

A Commodore Amiga 500. (Photo: Quagmire’s Photos/CC BY-ND 2.0)

Part of the reason the Prevue Guide faded from view was due to the significant changes in the cable landscape. If you go back in time 20 years and compare the channels on the air then with now, you’ll see very few similarities. Two decades ago, AMC stood for “American Movie Classics” and was closely associated with early Hollywood films, rather than zombies. In 1996, WGN and TBS were still local television networks that just happened to distribute their signals nationally.

Back then, the cable industry was still young enough that it could rely on industry-specific technology, rather than trying to make things work with consumer-standard gear.

Using obsolete technology is far from something that is specific to television, of course. Just ask the ATM industry, which used IBM’s OS/2 operating system for nearly two decades after its commercial demise.


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/gMwmR9Pkr64/how-the-commodore-amiga-powered-your-cable-system-in-the-90s

Original article

Phoenix Framework dockerfile

README.md

Phoenix Framework dockerfile

clone phoenix-docker

git clone https://github.com/indatawetrust/phoenix-docker.git && cd phoenix-docker

build image

sudo docker build -t phoenix-docker .

run container

sudo docker run -it -p 4000:4000 --name server phoenix-docker

start container

sudo docker start server

container bash

sudo docker exec -it server bash

start phoenix server

elixir --detached -S mix phoenix.server

localhost:4000


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/TW1WN-ZaNYo/phoenix-docker

Original article

Bad Lawyering, Not Bad Forms

As Robert Ambrogi has been reporting, the folks over at Avvo announced that they were launching a new legal forms offering that would compete with LegalZoom. Mr. Contract himself, Ken Adams, reviewed an Avvo form and concluded that Avvo was another of the “hack vendors” that was “foisting crap” and “dreck” on consumers. And Avvo responded to Adams’s “silliness” in a way that suggests to me that we are witnessing two different debates. Both debates are worth exploration.

Legal Forms are a Bad Idea


Avvo wants to debate the merits of consumer-facing legal forms. The basic outline of this debate is fairly well settled:


Should lawyers create legal forms? Yes. Anytime that a lawyer repurposes old product–which happens all the time–they are making the case for some form of document assembly or automation. If you have a good indemnification clause it is plain stupid to try to draft a new one from scratch.

Should consumers use legal forms? Sometimes. We generally don’t have lawyers around when we are filling out form contracts to lease homes, buy cars, or license software. When the need is straightforward, most people are sufficiently adept at filling out basic forms. Even if they aren’t, lawyers are cost prohibitive.

Isn’t there a danger? Sure. Not every situation is straightforward. The untrained person is more likely than the trained person to make a costly error.

This is where the debate normally heats up. The question becomes where to draw the line. At what point is the provider of the form handing the consumer something too likely to lead to self-inflicted harm? Avvo was prepared to respond to the standard criticisms.

First, they point out that their target audience is people who are already inclined to use forms rather than a lawyer. Second, they explain that the purpose of their free forms is to “upsell” consumers – i.e., convince the consumers to pay for assistance from a lawyer through Avvo Legal Services.

At worst, Avvo is providing a free service to someone who was not going to pay for a lawyer under any circumstances. The implicit suggestion seems to be that their free service is better than what the consumer would have otherwise done or, at least, just as good as the forms that the consumer would have paid some small amount for at LegalZoom.

Avvo not only concedes the common criticism of forms–most people would be better off if they consulted an attorney–its business model is based on convincing consumers of that premise. Avvo’s CEO Mark Britton referred to DIY as a “virus” and is adamant that you cannot compare mere forms to the bespoke work product of a trained lawyer:

“This is just silliness. The point that is being missed here, is that you have over 50 percent of people who have money and are potential clients but who are not using lawyers. You have this explosion of DIY that is like a virus. The question is how do you get in front of those people who want to do it themselves. Even though they say they want to do it themselves, they don’t really mean that. You cannot compare a bespoke product from a lawyer that will cost you thousands of dollars to a product that is an entry-level product designed for people who are doing everything they can to avoid a lawyer. Let’s get them that product and then start the conversation from there.”

The debate that Avvo is engaged in:

1. Whether the provision of free forms is more likely to convince consumers to use lawyers

2. Whether consumers who are not going to pay for a lawyer under any circumstances are better off with access to free forms

Avvo answers in the affirmative to both.

Forms are Fine, Lawyers are Bad


Ken Adams is having an entirely different debate. He is stating that Avvo’s forms are “crap” on their own merits. That is, he is comparing Avvo’s form to a good form, rather than to the bespoke work product of a good lawyer.

Adams, however, is famously less than impressed with what many lawyers produce. In a previous post, he subjected a LegalZoom contract to the same kind of scrutiny and came to a similar conclusion: “commoditized mediocrity.” He then added this gem:

It’s clear why business customers might want to try LegalZoom. Lawyers cost more than LegalZoom. Choosing a lawyer can be a crapshoot. And there’s a fair chance that an NDA produced by a lawyer you retain wouldn’t be any better than LegalZoom’s.

Let that soak in for a second. Adams is absolutely saying that the forms from Avvo and LegalZoom are mediocre. But he is also saying that a fair number of lawyers are just as mediocre, if not worse. Where I made the banal observation that it was obviously stupid to start from scratch if you already have a good indemnification clause, Adams would likely counter that the indemnification clause you have probably isn’t all that good and that most lawyers are incapable of writing one that is. As he writes in the Avvo post, “the quality failure of the consumer market is just part of the quality failure of contract drafting as a whole.”

Consider an analogous post where Adams takes the same critic’s eye to a two-page, simplified cloud contract for which IBM was getting accolades. Adams labels the contract the work of “dilettantes” and then lays out a case that most lawyers should leave contracts to the professionals (i.e., being a lawyer does not make one a contract professional):

What conclusion do I suggest you draw from my markup? That contract language is specialized—it’s best left to specialists. Knowing your company’s transactions doesn’t make you a specialist. And many years of being steeped in traditional contract language doesn’t make you a specialist. You become a specialist only by making a concerted and disciplined attempt to familiarize yourself with the building blocks of contract language, the good and the not-so-good.

If you’re not a specialist, you’re a dilettante. Those responsible for IBM’s new cloud services contract are presumably knowledgeable, enthusiastic, and hard-working, but when it comes to contract language, the shortcomings in the new contract suggest that they’re dilettantes. That’s to be expected. In fact, the contracts ecosystem would work better if contract language were left in the hands of a limited number of “legal knowledge engineers” (to use Susskind’s clunky but apt phrase) working closely with those who have a broader understanding of the business and legal issues.

Adams made similar comments in a post labeling the Google-Motorola merger agreement “a mediocre piece of drafting. It’s bloated and hard to read, and that takes a toll at every stage—drafting, reviewing, negotiating, and monitoring compliance. And there might be lurking in the verbiage some bit of confusion that metastasizes into a dispute down the road.” He then answers the question that almost anyone would ask.

Mediocre? How can that be! After all, Google is represented by the prominent law firm Cleary Gottlieb—presumably they did the bulk of the drafting. Well, the Google–Motorola merger agreement is mediocre because all big-time M&A drafting—or at least all that I’ve seen—is mediocre.

That should come as no surprise, seeing as the language of mainstream drafting generally is dysfunctional. That’s due to a mix of factors. The root cause is that because any transaction will closely resemble previous transactions, drafting has become largely an exercise in regurgitation, with most contract language being given a pass. Also, law firms aren’t suited to the task of retooling and maintaining template contracts. (For more on these factors, see my article The New Associate and the Future of Contract Drafting; go here for a PDF copy.)

But in addition, most of the M&A luminaries I’ve approached have made it clear that they’re wedded to old habits and conventional wisdom. Perhaps what makes M&A drafting particularly resistant to change is that clients are less inclined to meddle when it comes to “bet the farm” work such as the Google–Motorola deal.

The way to fix M&A drafting would be to turn it into a commodity process. Google, if you want your M&A contracts to be free of shortcomings of the sort manifest in the Google–Motorola merger agreement, I suggest that you enlist some like-minded companies and form a consortium to create a rigorous set of document-assembly M&A templates. You could fund it with spare change retrieved from your couch. Judicious use of the carrot and the stick would get leading law firms to participate. The work could be done quickly and efficiently. The basic idea should be familiar to you—after all, this month Google Ventures invested in Rocket Lawyer, which aims to commoditize, in a much more rudimentary way, some basic consumer and small-business documents.

[In a subsequent post, Adams reviews an actual contract from Rocket Lawyer. The title of that post: “Rocket Lawyer? Contract Automation FAIL.”]

Adams is not opposed to forms. Adams is about the staunchest supporter of forms you can find. He just believes that most lawyers lack the training to author first-rate forms. He is not saying Avvo, LegalZoom, and Rocket Lawyer forms are mediocre because they are forms. He is saying they are mediocre because they are mediocre. He reaches similar conclusions about the bespoke work product of lawyers hired by IBM and Google.

As Compared to What

Avvo’s position touches upon the IKEAization of law. Much of IKEA’s furniture is disappointingly serviceable. It works for the intended purpose. But it is made of cheap, fragile particle board. It has a high propensity to break and is notoriously painful to put together.

Yet, many of us shop at IKEA anyway because it is substantially less expensive than traditional furniture. Should consumers be permitted to make the same tradeoff when it comes to legal services? Slightly worse but radically cheaper.

It’s an important question for every legal consumer, including in-house counsel, who are not only under pressure to consider less expensive alternatives to traditional law firms but should also keep in mind the lessons of Do Less Law. Budgets are finite, and resources should be put to their highest and best use. Tradeoffs are unavoidable.

But the question of slightly worse at substantially lower cost is of particular significance for consumers who cannot otherwise afford legal services. The access-to-justice gap is not going to close because we talk about it endlessly. Beginning to close the access-to-justice gap means actually making the structural changes that would provide more access to justice.

But the whole IKEAization discussion rests on an implicit comparison. We know, for example, that the Avvo and LegalZoom forms are cheaper. We can do that math. But do we really know whether they are worse than what the consumer would have gotten from the lawyer they would have hired (if they could have hired one)? The instinctive answer seems to be that, yes, we know the expensive human lawyer will outperform the inexpensive (or free) form. Adams, however, calls into question our knee-jerk reaction. And even if the forms are worse, the issue of how much worse matters quite a bit in a world of tradeoffs. Dangerous and suboptimal are different conclusions with different implications.

I would be interested to hear how crowds of lawyers react to Adams if/when he tells them that most of them are bad at contract drafting. According to Bryan Garner, they “bristle” when he tells them that, “on the whole, our profession can’t punctuate.” Garner, the authority on legal writing, does more than remark on poor comma usage [so guilty!]; he tells rooms full of lawyers that they are bad at writing in general:

For many years in lectures, I’ve likened practicing lawyers, when it comes to writing, to 23-handicap golfers who believe that they’re equal to the touring professionals. For those not golfers, this would mean that pretty poor golfers—those who habitually shoot in the mid-90s but benefit from the big handicap—somehow fool themselves into believing that they really are shooting in the mid-60s, and that they’re about as good as it gets. I’ve been trying, in other words, to say that lawyers on the whole don’t write well and have no clue that they don’t write well.

In the quoted article, Garner discusses Dunning-Kruger, or illusory superiority. Ignorance begets confidence because of meta-ignorance – ignorance of our own ignorance. Because we don’t know what we don’t know, we labor under delusions of adequacy. We then erect those delusions of adequacy (or grandeur) as the standard against which we measure all suggestions of departure from the reigning status quo. Legal forms are just part of a much broader discussion of what kind of work demands a human admitted to the bar in a particular state. Think of UPL regulations, humans as the “gold standard” in document review, the kind of work amenable to outsourcing, etc.

I write quite a bit about using process and technology to complement legal expertise. I spill most of my digital ink defending the complements–process and technology–and trying to explain how they augment or leverage the expertise. Maybe I need to spend a little more energy questioning the implicit assumption that the expertise is all that expert.


Original URL: http://feedproxy.google.com/~r/geeklawblog/~3/BVJ0aywRTkI/bad-lawyering-not-bad-forms.html

Original article

Followup on features

In the last few days I’ve posted two feature requests:

  1. A Node.js app to browse a server’s file system, and
  2. A way to tell the forever utility to save a snapshot of the current mix.

Happy to report that in a Facebook thread I got a pointer to Cloud Commander from Hanan Cohen. It seems to be exactly what I asked for.

I’ll let you know if we get the forever enhancement.


Original URL: http://scripting.com/2016/04/08/1171.html

Original article


WordPress.com turns on HTTPS encryption for all websites

WordPress.com is adding HTTPS support for all of its blogs. If you have a custom domain or a blog under the wordpress.com domain name (like bestcrabrestaurantsinportland.wordpress.com), you’re good to go.

While many social services like Facebook and Twitter have supported HTTPS for a while now, WordPress.com was still lagging behind for custom domain names.

Since 2014, WordPress.com subdomains have supported HTTPS, but custom domains have not. Supporting them isn’t as easy as flipping a switch, because you need a certificate for each custom domain name.

Thanks to the Let’s Encrypt project, it has become much cheaper and easier to implement HTTPS across the web. WordPress.com is taking advantage of this initiative for its websites. Each website now has an SSL certificate and will display a green lock in your address bar.

As a nice side effect, Google tends to favor websites that support HTTPS over HTTP-only websites. So your WordPress.com website should rank higher in Google search results.

I’m sure you all have a burning question. What do I need to do to activate HTTPS? In an Oprah-like moment, WordPress.com is activating HTTPS on all websites without you having to do anything. You get an SSL certificate! Everyone gets an SSL certificate!

Featured Image: Montillon/Flickr UNDER A CC BY 2.0 LICENSE


Original URL: http://feedproxy.google.com/~r/feedsapi/BwPx/~3/GZEx1izAuow/

Original article
