Automatic backups with standard UNIX tools

A good backup routine helps you sleep well. I spent years playing around looking for the perfect backup tools for my server and ended up close to where I started: with a couple of standard UNIX tools.

So, I wanted a daily backup routine that archives selected content and saves it on a remote host, not incrementally but stamped by date. The tools I use to accomplish that are SSH, cron, and tar. You already have them on your server, so let's get started. If you do not have an RSA key, open a terminal and type the following:

ssh-keygen -t rsa

Make sure that you leave passphrase empty, as follows:

Generating public/private rsa key pair.
Enter file in which to save the key (/home/matias/.ssh/id_rsa):
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/matias/.ssh/id_rsa.
Your public key has been saved in /home/matias/.ssh/id_rsa.pub

Now, store the public key (id_rsa.pub) on your target server; change the following command to fit your needs (user/host):

cat $HOME/.ssh/id_rsa.pub | ssh user@host "cat >> .ssh/authorized_keys"

Now you need a simple bash script that archives the content. Since my content is already compressed, I use tar without compression to pack the source files into an archive named with hostname and date, as follows:

DATE=`date +"%Y-%m-%d"`
SOURCE="/var/www"                            # adjust: the content you want to archive
TARGET="backup-`hostname`-$DATE.tar"         # archive named with hostname and date
tar -cvf - $SOURCE | ssh user@host "cat > $TARGET"

Name the file, for example backup.sh, and continue. Now, put this into your crontab and choose when you want the script to run (below is nightly at 04:30).

30 4 * * * bash /home/matias/backup.sh > /dev/null 2>&1

The reason I pipe the tar output straight to SSH is that I do not want to store the file temporarily, since it could be big. Second, I do not want to use SCP, since it is notably slower. Enjoy!
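Before trusting the routine, it is worth checking that an archive is actually readable: tar can list contents without extracting. Below is a local sketch; all the /tmp paths are made up for the demo, and the real job writes to the remote host instead:

```shell
#!/bin/bash

# Scratch tree standing in for $SOURCE (hypothetical demo paths).
mkdir -p /tmp/backup-demo/src
echo "hello" > /tmp/backup-demo/src/file.txt

# Pack it the same way the nightly job does, but into a local file
# instead of piping over SSH.
DATE=`date +"%Y-%m-%d"`
tar -cf "/tmp/backup-demo/backup-$DATE.tar" -C /tmp/backup-demo src

# List the archive contents to verify the backup is readable.
tar -tf "/tmp/backup-demo/backup-$DATE.tar"
```

On the server itself, the same tar -tf check can be run over SSH against the stored archive.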


7 awesome paid iPhone apps you can download for free right now

After giving you our biggest list of the week on Wednesday, we’re back with another great list of paid iPhone and iPad apps that are currently on sale for free. There are still a few nice apps available for free in yesterday’s list if you hurry, but today we have a fresh new batch of sales for you to enjoy.

Just remember, these great deals could end at any time so be sure to get a move on if you want to take advantage of them!


These are paid iPhone and iPad apps that have been made available for free for a limited time by their developers. There is no way to tell how long they will be free. These sales could end an hour from now or a week from now — obviously, the only thing we can guarantee is that they were free at the time this post was written. If you click on a link and see a price listed next to an app instead of the word “get,” it is no longer free. The sale has ended. If you download the app, you will be charged.


LightBoxr

Normally $0.99.

LightBoxr is the next evolution in photo editing applications. LightBoxr is packed with retouch tools, special effects, and filters that make your photos look gorgeous.
You can even create your own recipes and handle a batch of images easily. ‘Shuffle’ and ‘Multi-Edit’ can process piles of photos faster than other apps with a single touch, freeing you from repetitive work.

This app supports professional-grade photo editing functionality but is designed for the casual iPhone user.

– Shuffle: Randomly apply new effects to your photo with one touch
– Recipe: Find a look you like? Save it as a favorite and re-apply to your next photo with one touch
– Multi-Edit: Load up to 10 images at once and apply effects to all of them with one touch

– Full manual controls for Exposure (ISO, Shutter Speed), Focus (Auto, Manual), and White Balance (Preset, Manual)
– Volume Shot
– Timer
– Horizon Indicator
– Front/Rear Camera Selection
– 3×3 Grid
– Flash

– 66 Filters
– 17 Light Leaks
– 15 Noises
– 14 Frames/Borders

– HDR (Supports 2 Types)
– Tilt Shift (Supports 2 Types)
– Sharpen (Supports 2 Types)
– Vignette

– Brightness (Fade)
– Contrast
– Saturation
– RGB Channels
– White Balance
– Temperature
– Shadow Adjust
– Highlight Adjust
– Exposure
– Gamma
– Vivid Environment (Vibrance)

– Straighten
– Crop
– Rotate
– Flip (vertical/horizontal)

Share with your friends
– Export to Instagram
– Facebook
– Twitter
– Tumblr
– Flickr

Download LightBoxr

Calendar Widget Lite

Normally $2.99.

See the current month's calendar and your events for the next 7 days without opening another app.
Put a small calendar in your Today view in Notification Center to display events from Calendar.
– Choose the first day of the week.
Easy and simple.

Check the help page if you need assistance, or send an email to the developer.

The full version also includes:
– See past, current and future events from your Calendar.
– Personalize with 10 themes.
– Choose your own custom background.
– 3 widgets.
– 2 styles: flat and default.

Download Calendar Widget Lite


Phot.oLab

Normally $0.99.

The best and simplest photo editing studio is now available on the App Store!

With Phot.oLab you can add beautiful artwork, backgrounds, stickers, and typography.

Apply special filters, shapes, textures, photo effects, and overlay masks to your favorite photos. Using the advanced adjustment module you can customize your pictures to look exactly how you want.

Brand new content including fonts, shapes, filters, overlays and much more added monthly!


√ Typography
– Add your favorite quote to your pictures from a beautiful collection of custom fonts
– Resize and adjust the opacity of your text

√ Stickers and artwork
– Select your favorite artwork, stickers and overlay masks from a varied collection and add them directly to your photos
– Adjust the opacity of your stickers and artwork

√ Textures
– Apply wonderful textures from a great collection.
– Adjust the opacity of your textures
– Rotate your textures to your desired position

√ Shapes
– Choose a shape to add to your photos
– Adjust the opacity of your shapes
– Change the color of them

√ Professional Photo Filters
– Use the best filters from a unique collection available

√ Advanced Adjustment Module
– Adjust your picture's brightness, contrast, saturation and exposure from a comprehensive adjustment module

√ Other details
– Select from 3 different themes
– Export your photos directly to your Photo Library, share them to Instagram, Facebook, Twitter, E-Mail, WhatsApp or SMS, or open them in other apps

Download Phot.oLab


Runr

Normally $0.99.

Start runs quickly. Get moving and watch yourself improve. Detailed run tracking helps you lose weight or stay fit. Train for nearby races, then make the commitment and register. Go for a run with Runr.

Featured by Apple in Health & Fitness “Apps For Your Run”.

No delays. Stop fiddling with settings, presets, and playlists – just GO. It’s easy to see how far, fast, or long you’re going during a run (no more squinting or stopping to see the screen during a run – YAY!).

Train for your next Marathon or 5K with Runr. View races in your area, get the scoop, see who else is going, and then make the commitment: REGISTER!

Check your progress and watch yourself improve through beautiful distance and pace graphs. See past routes, distance, time, calories, and pace. Runr works with Apple Health on iOS 8 so you can get an overview of your health with the Health App.

Your run details, route, location, and personal information are never shared, uploaded, or broadcast. Everything is completely private. Unlike other running apps (which broadcast your location on social media and across various servers), Runr gives you complete control over your data.

Hear spoken feedback during your run about distance, time, calories, and pace. It works great when your iPhone is locked too. You’ll even get updates via Notification Center during your run.

As you run, you’ll earn achievements and climb leaderboards. Runr has 25+ achievements that encourage more running, at different times, and in different places. Climb seven different leaderboards to compare yourself globally and with your friends.

– Unlimited Run Storage (limited to 10 runs without Premium)
– Unlimited Nearby Races (filtered at 5 races without Premium)
– In-Run Music Playback
– Pace Colored Routes
– All Future Updates (iCloud Syncing and Goals coming soon)

Runr Premium is available for $0.99/month or $3.99/year. You can subscribe and pay through your iTunes account. Your subscription will automatically renew unless cancelled at least 24 hours before the end of the current period. Auto-renew may be turned off by going to your Account Settings after purchase. A free-trial period may be provided and can be optionally forfeited by starting the service earlier than scheduled.

Download Runr

iDownloader Pro

Normally $2.99.

We believe that downloading to your iPhone, iPod touch or iPad should be as easy as it is on your computer. That is why we created iDownloader Pro.

Now you can easily download files from the web to your i-Device and then view/play the downloaded files.

iDownloader Pro Key Features:

√ Web browser

– Built in Ad-Blocker!
– Multi-tab Safari-like browser
– Tap and hold to force download
– Full screen mode
– Bookmark manager
– Ability to spoof browser’s user-agent
– HTTP authorization support
– Private browsing
– Search History
– Browsing history
– UTF addresses support
– Pop-UP blocker

√ Web Password Keeper integrated with in-app Web Browser (NEW!)

– Lets you log into any website in just 3 taps
– Saves and manages as many user profiles for each website as you want
– An exclusions list lets you switch off password saving for selected websites
Note: to start the Web Password Keeper, please switch it “On” in the app’s Settings

√ Download manager

– Ultra fast downloading speed
– Up to 50 simultaneous downloads
– Download in background mode (10 min max due to iOS restrictions)
– Supports resumption of interrupted downloads
– Download Files over 3G

√ Media player

– Background Playback
– Video playback of mp4, m4v, mov, 3gp and m3u8 videos
– Thumbnails view
– Export videos to Camera Roll
– AirPlay support (iOS 4.2 or above)
– Playlists
– Save the Video play position

√ File manager

– Dropbox Integration
– Folder and sub-folder support
– Move, rename and delete files
– Sorting by name, type, size, date
– Extract zip and rar archives
– Sort files in ascending or descending order

√ File viewer

– Full-Featured Photo Viewer with slideshows
– Full-Featured Document Viewer that supports .pdf, .doc, .xls, .ppt, .txt, .html and .rtf file formats
– Video Viewer with Slideshow capabilities
– Open Files in other apps

iDownloader Pro is constantly updated with new features. If there is anything you would like to see in the application, just drop us an email through the “Send Suggestion” link in the app’s “Settings” and we will do our best to add this feature in our next app update.

Download iDownloader Pro


Weightbot

Normally $1.99.

Weightbot is a weight-tracking robot. Whether you are trying to lose (or gain) weight, tracking your progress has never been more fun. Set your goal, record your weight, view your BMI, and see your progress on a beautiful graph. Weightbot was designed for everyone to use.


* Input your weight up to once a day with a streamlined user interface
* Easily change or remove your weight from any day
* Instantly view your BMI as you set your weight
* Set your goal weight to know where the finish line is
* Rotate clockwise to view your weight over time on a beautiful and easy to read graph
* Rotate counterclockwise to view the goal view, which lets you quickly see your progress towards your weight loss/gain goals
* Toggle units between lbs/kg for weight and ft/cm for height
* Set a numeric passcode to protect your data from being viewed by others

Visit the developer’s site for demos and more information.

Download Weightbot

Printer Pro

Normally $6.99.

Print attachments, documents, web pages and more right from your iPhone and iPad to any Wi-Fi or USB printer.

Printer Pro lets you print wirelessly from the iPhone or iPad. It can print directly to many Wi-Fi printers, or to any printer attached to your Mac or PC via a helper application installed on your computer.

Once installed, Printer Pro appears in the “Open In…” list on your device. This lets you print documents from Mail, PDF Expert and many other applications on your iPad that support this function.

Using the “Open In…” approach, you can print files from popular online storage services like Dropbox and Google Drive. It’s just a matter of a few taps to download your file via the free Dropbox or Google Drive application and send it to the printer.

To print a web page, just change “http” to “phttp” in the address bar in Safari and tap Go. The page will immediately be opened in the Printer Pro with print button right above your finger. You can print web based documents as well using this approach.

With Printer Pro you can print:

– Email Attachments

– iWork documents

– Web pages

– Files from other applications

– Clipboard content

– Photos

– Documents on Dropbox and Google Drive

– Contacts

◆ Printer Pro Desktop
Get the free helper application for your computer to print more document types and with better quality.

◆ List of supported document formats
PDF, Word, Excel, Powerpoint, Pages, Numbers, Keynote, TXT, HTML, JPG, Safari webarchive

Download Printer Pro


Protect our climate and health, not multinational profits


Embargoed until 1pm, Friday 13 February 2015


OraTaiao: The New Zealand Climate and Health Council strongly supports this week’s call by international health advocates for the public release of the Trans Pacific Partnership Agreement (TPPA) negotiating text, for a health check.

This week’s edition of world-leading medical journal The Lancet includes a call by 27 health experts from New Zealand, Australia, Canada, Chile, Malaysia, the USA, and Vietnam (including leaders of the World Medical Association and World Federation of Public Health Associations) for the TPPA to be made public so its overall health impacts can be assessed.
“Leaked documents indicate the TPPA will have far-reaching implications, including undermining our ability to protect our climate and the future health of New Zealanders, yet the entire agreement is still being kept secret from the public,” says OraTaiao co-convenor Dr Rhys Jones.

“The biggest threat is the new ‘Investor State Dispute Settlement’ (ISDS) clause. This means overseas companies will be able to sue our government if they suspect any law change might threaten their profits. Yet our government will not be able to sue those companies for damage to our climate, health and economy.

“Under the TPPA, the New Zealand government could find itself hamstrung in efforts to reduce climate damaging emissions and to promote health. Overseas governments are being sued for millions of dollars as a result of similar provisions in other trade agreements. For example, Germany faces legal action over closing a nuclear power plant and reducing emissions from a coal-fired power plant.

“If governments can face costly legal action over simply protecting the health of their people, we urgently need to run a health check on what’s being negotiated for New Zealand.

“The irony is that this same week in Geneva talks continue towards international agreement on climate action, and 13-14 February marks Global Divestment Day as the world increasingly turns from fossil fuels towards clean renewable energy. Yet our government is secretly locking New Zealand into an unhealthy deal to protect corporate profits.”

Dr Jones says negotiating documents in a similar trade deal between the EU and the US have now been revealed, after the EU Ombudsman ordered their public release.

“This sets an important precedent that TPPA countries can and should follow.

“The world’s expert climate scientists have told us we need to rapidly move towards a low or zero emissions economy. It’s time to protect our climate and our health, not multinational profits,” Dr Jones says.



Rhys Jones (Ngati Kahungunu) is a Public Health Physician and Senior Lecturer at the University of Auckland. He co-convenes OraTaiao: The NZ Climate and Health Council.

The call by 27 health leaders and experts, from seven Pacific Rim nations, will be published in the Saturday 14 February 2015 print issue of The Lancet, and is available online under ‘Correspondence’ from Friday 13 February, 1pm NZDT. Lead signatories/co-authors are from New Zealand and Australia.

OraTaiao: The New Zealand Climate and Health Council is a group of senior doctors and other health professionals concerned with climate change as a serious public health threat. They also promote the positive health gains that can be achieved through action to address climate change.

According to the latest expert climate science (IPCC AR5 WG1), for a two-thirds chance of staying within the internationally agreed limit of 2°C warming of our planet, we need to emit less than half a trillion tonnes of greenhouse gases over the next 35 years.*

*To give a 66% chance of staying below 2°C, the IPCC calculated 1 trillion tonnes as the maximum amount of CO2 emittable over the industrial period; by 2011 the world had already used 515 billion tonnes of CO2 from that budget, hence 485 billion tonnes remain.

Lord Nicholas Stern’s prominent report on climate change (2007) showed how failing to act on climate change will produce profoundly greater costs and damage to the economy and human health, but probable economic gains by moving quickly on emissions reductions. Lord Stern was a UK government advisor and former World Bank Chief Economist. Reference: Stern N. The economics of climate change: the Stern review. Cambridge: Cambridge University Press, 2007.

The Trans Pacific Partnership Agreement (TPPA) is a proposed trade agreement between New Zealand, Australia, Brunei, Canada, Chile, Japan, Peru, Mexico, Malaysia, Singapore, the United States, and Vietnam. It has been criticised globally by health professionals, internet freedom activists, environmentalists, organised labour, advocacy groups, and elected officials, in large part because of the proceedings’ secrecy, the agreement’s expansive scope, and controversial clauses in drafts leaked publicly.

The Lancet statement calls on governments to publicly release the full draft TPPA text, and to secure independent and comprehensive assessments of health impacts for each nation (which evaluate direct, indirect, short- and long-term impacts of the TPPA on public health policy and regulation, publicly funded health systems, the cost of medicines, and health).
Last year over 250 senior New Zealand health professionals signed an open letter of concern to the Prime Minister about the risks of the TPPA. Their concerns included the way that trade agreements could stifle laws to protect against hazardous substances (such as plain packaging on tobacco), interfere with environmental health and safety legislation, or block necessary controls on excessive use of drugs manufactured by transnational companies.
Several leaks of the TPPA environment chapter, posted on Wikileaks in January and again in February 2014, indicate that it will do nothing to protect New Zealand’s right to introduce new measures to address climate change.

More detail is available on investor-state clauses and the implications for New Zealand. This includes Germany currently facing two investment disputes brought by Swedish firm Vattenfall.*

*(After the Stern report on climate change, Germany took measures to reduce the damaging effects of carbon dioxide emissions from a coal-fired power plant owned by Vattenfall, and this is being challenged. Then Germany closed a nuclear power plant following the Fukushima disaster, and this too is being challenged. These challenges do not claim that the power plants are safe, but that investors are losing money.)

Negotiating documents in the Transatlantic Trade and Investment Partnership between the European Union and the US have recently been publicly released, following the EU Ombudsman’s order to make these public.

The latest session of the Ad Hoc Working Group on the Durban Platform for Enhanced Action (ADP2.8) is underway from 8-13 February in Geneva, Switzerland. This meeting is part of the pathway to Paris in December this year, to reach agreement on global climate action at the Conference of the Parties. Over the next few months New Zealand will be required to submit its plan to reduce climate-damaging emissions.

Friday 13 and Saturday 14 February mark Global Divestment Day, with events organised around the world, including in New Zealand, to accelerate the global trend to withdraw investment from fossil fuels.

© Scoop Media


30 charged so far in Christmas drink and drug drive campaign



DORSET Police’s Christmas drink and drug drive campaign has seen 30 people charged since it began on December 1.

The campaign, which comes to an end on New Year’s Day, has seen officers carry out a crackdown over the festive season with increased patrols and roadside checks, while all drivers involved in collisions have been breathalysed.

Convicted drink drivers, who face at least a fine and lengthy driving ban, are also being named and shamed in the Daily Echo and Dorset Echo.

Inspector Matt Butler of Dorset Police’s Traffic Unit has urged people to call the police on 101 or 999 immediately if they think someone is drink or drug driving, or call Crimestoppers on 0800 555111.


Everything you ever needed to know about the Terminal

I’ve been using the Unix command line since 1983, and like most software developers I keep the Terminal app as a permanent fixture in my Dock. Over the years I’ve learned a lot of things that make working in this environment more productive, but even old dogs like me are constantly learning new tricks.

As much as I love them, these long “trick lists” on Stack Overflow have a problem: they’re poorly organized with little narrative describing why you’d want to use a technique. This long homage to the command line is my attempt to remedy that situation.

Note: I originally learned the shell using the newfangled csh (which was a huge improvement over the original sh.) When I first started using Mac OS X, I tweaked it to use tcsh because that’s what I knew and loved. Over time, I gave up using these tweaks and started using the default shell: bash. The following examples assume that you’re doing the same.

Editing Keys

My most recent discovery, and the one that made me realize a post like this would be helpful to a lot of fellow developers, was the revelation that an option-click in the Terminal window emulates arrow keys.

Say you’re entering the following in the command line:

$ echo "this is a test"

Oops! You left out “awesome” and now find yourself tapping the left arrow a bunch of times before entering the missing word. The pain increases linearly with the length of the command line you screwed up. If your Mac keyboard is anything like mine, the arrow keys are shiny from constant wear and tear.

Luckily, you can give those keys a bit of a rest with this simple trick: try Option-clicking on the first letter in “test”. The cursor is just where you want it and you haven’t touched the arrow keys!

To get a better idea about how this works, try this:

$ cat -v

Now, press the arrow keys. The terminal emulator sends an escape (^[) followed by an open bracket and then A for up, B for down, C for right, and D for left. Now hold down the Option key and click with the mouse: the Terminal app just emits arrow keys to move the cursor block between the source and destination locations. This means it also works in tools like vi or even the Xcode debugging panel: a huge time saver just got even better!

(Use Control-C to get out of this echo mode and back to your shell prompt.)

While we’re on the Option key, you can also hold it down while using the left and right arrows to move the command line cursor by a full word instead of a character. Which is exactly what you need when you’re editing a path with a missing directory name.

The command line also responds to control keys. The ones I use the most are Control-A and Control-E to move to the beginning and end of the line. Control-U and Control-K are also useful to delete text from the cursor to the beginning and end of the line buffer. I’ve heard that these are standard emacs key bindings, but can’t confirm this since I’m a vi user.

Note that the Control and Option keys also work in standard Cocoa controls. In a Finder window, you can use Command-Shift-G to change the folder path and then use the same muscle memory that you’ve acquired in the shell.

For those really long commands, you’ll probably want to get into a more comfortable editing environment. Just use Control-X followed by Control-E to open the command buffer in your EDITOR. (More about setting up the EDITOR in just a bit.)

Another great key to know about is Tab. Try entering this:

$ ls /Vol

Press Tab once and it completes “/Vol” to “/Volumes”. Press Tab twice and you’ll see a list of all mounted volumes. Welcome to the Lazy Typist Club™.

Shell Setup

As I mentioned earlier, I no longer configure which shell I use on a new OS X install. I do, however, change the bash configuration on every Mac I touch.

Every time a new Terminal window is opened a shell process is created with your current login. As the shell is initialized, the .profile file in your home folder is used to initialize the interactive shell. Basically, you can think of .profile as a bunch of typing you don’t have to do each time a new Terminal window is created.

(The name .profile dates back to the original sh. See “man bash” for a ton more info and options.)

Here’s my .profile. I like to keep it fairly simple:

alias ll="ls -lahL"
alias con="tail -40 -f /var/log/system.log"

bind '"\e[A":history-search-backward'
bind '"\e[B":history-search-forward'

export EDITOR="vi"
export CLICOLOR=1
export XCODE="`xcode-select --print-path`"
export PATH="/Users/CHOCK/bin:$XCODE/Tools:$PATH:\
/opt/local/bin"

Let’s take a look at each group of settings.


Aliases

Aliases let you define command shortcuts. Since I’m old and forgetful, there aren’t many. I used to have a lot of aliases, but found myself constantly using the alias command to list them out and find the right one. Which, of course, defeats the entire purpose of a shortcut.

The alias ll lets me list files in a format that is more readable, especially with large files. I like using con instead of firing up the Console app (which is total overkill for most situations). It should be pretty obvious how to create your own aliases. A lot of the command tricks you’ll learn below will be good candidates if you use them often enough.
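A quick way to experiment is a throwaway script. One wrinkle worth knowing: bash only expands aliases automatically in interactive shells, so a script has to opt in with expand_aliases (in a Terminal window you can skip that line). A minimal sketch using the same ll alias as above:

```shell
#!/bin/bash

# Aliases are only expanded in interactive shells unless we opt in.
shopt -s expand_aliases

# The same ll alias from the .profile above.
alias ll="ls -lahL"

# The alias expands to the full ls command before running.
ll /tmp > /dev/null && echo "ll works"
```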

Search Setup

By default, the shell allows you to use Control-R to search previous commands. After typing the control sequence, your command history is searched for each letter that you type. Cool idea, but in my opinion, it’s a terrible user experience. The reason is that my history is filled with entries that are very similar. If you have both "ssh CHOCKMASTER@domain1" and "ssh CHOCKMASTER@domain2", there’s just too much typing to get the right match.

The next two lines in my .profile solve this problem: the bind command tells the shell to do a history search when the up and down arrow keys are used. When the shell is in this mode, you can just type “ssh” and then use the arrows to select the command you want to run again. This fits my needs much better and feels more consistent with the shell’s default ability to move through the history with the up and down arrow keys if you haven’t entered any text into the edit buffer.

Environment Variables

Finally, there are the environment variables. Again, I’ve whittled them down to the bare essentials. The EDITOR variable is used by Control-X, Control-E in the shell and lots of other tools. You can change it to emacs, but then I’d laugh at you.

The CLICOLOR variable is used by the ls command to show files and folders with color coding. You can change the colors using the LSCOLORS environment variable, but the configuration string is just too damn arcane for me, so I skip it and go with the defaults. See the man page for ls to learn more about the color coding and the options.

Finally, there’s the PATH environment variable. The items, separated by colons, are directories in the file system that contain the commands I use. By default, these paths are “/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin”.

I have a bunch of command line tools and scripts that I’ve accumulated over the years and they are all in the bin directory of my user folder. I also use MacPorts, so the /opt binaries get added to the end of my path.

Note: I’ve added a backslash to the end of the first line that contains the PATH definition. This is the “line continuation character” and can be used to break up lines that are too long for your editor, terminal or the narrow column of this web page :-)
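The effect of putting a directory on the PATH is easy to demonstrate with a scratch directory playing the role of ~/bin (the /tmp paths below are made up for the example):

```shell
#!/bin/bash

# A scratch bin directory containing one tiny command (demo path).
mkdir -p /tmp/path-demo/bin
printf '#!/bin/sh\necho hello from path-demo\n' > /tmp/path-demo/bin/hello-demo
chmod +x /tmp/path-demo/bin/hello-demo

# Prepend it to PATH, just like the bin directory in the .profile above,
# and the shell now finds the command by name alone.
PATH="/tmp/path-demo/bin:$PATH"
hello-demo
```

Because the shell searches the PATH entries in order, directories you prepend win over the system defaults; directories you append (like the MacPorts binaries) only fill in commands the defaults don’t have.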

If you use Xcode, you might find it handy to put the /Tools folder in your PATH. If you want command line access to the same versions of gcc, git, svn and other tools that Xcode uses, you can substitute /Tools with /usr/bin. Since I prefer to use MacPorts to get specific versions of the tools I use, I generally don’t need access to Xcode’s binaries. If I do, it’s easy enough to do this:

$ $XCODE/usr/bin/gcc --version

Note that since the .profile is only read when the shell starts, the XCODE environment variable won’t be correct if you run xcode-select after the shell is started. This is usually only a problem if you’re juggling multiple versions of Xcode and can easily be solved by closing the Terminal window and opening another.

Shell Scripts

As I said before, I populate my ~/bin directory with a bunch of useful tools that I’ve developed over the years.

Two of my favorite scripts in that repository are psc and opensim.


The first script is an oldie but a goodie (note the shebang: it’s tcsh!). I named it psc:

#!/bin/tcsh
set cmd = 'ps axo pid,ppid,user,command'
if ("$1" != "") then
  $cmd | grep "$1" | grep -v "grep $1" | grep -v "bin/psc"
else
  $cmd
endif

It’s a wrapper for ps that shows each process’s full command line and can target a specific app. For example:

$ psc Xcode
15401   150 CHOCK  … Xcode -psn_0_26982842
15476 15401 CHOCK  … XcodeDeviceMonitor --bonjour _15401

Now you’ll see all of the processes that have “Xcode” in the command. The process ids are also shown, so you can kill them, of course.

I also find that seeing the parent process id, which is shown in the second column, is important in these days where everything seems to be a child of launchd. It’s also helpful for finding things like XPC processes and other items that get executed from an app’s bundle.


If you’re an iOS developer, the opensim script is very helpful for finding your app’s sandbox folder in the Simulator. Here’s the script:


#!/bin/bash
if [ -z "$1" ]; then
  echo "usage: $0 <app name> [ Preferences | <folder in Documents> ]"
else
  base="$HOME/Library/Application Support/iPhone Simulator"
  app=`ls -1td "$base/"*"/Applications/"*"/$1.app" | head -1`
  if [ -n "$app" ]; then
    dir=`dirname "$app"`
    if [ "$2" = "Preferences" ]; then
      open "$dir/Library/Preferences"
    else
      open "$dir/Documents/$2"
    fi
  fi
fi

And here’s how to use it: pass the app’s name, plus optionally either Preferences or a folder inside Documents. For example, opensim MyApp Preferences opens the simulated app’s preferences folder (with MyApp standing in for your own app’s name).

Version Control

Most of the other stuff in my ~/bin directory are utilities that help with my own workflow (managing servers, testing, backups, etc.)

One of the reasons I prefer to keep these kinds of tools in separate files (as opposed to aliases) is that they’re easier to manage from a version control repository. When I set up a new machine, I just go to my home folder and check out my bin directory. Fin.

Another reason that I use files instead of aliases is that they’re easier to search. For example, if I’m looking for a script that I wrote three years ago and I know it uses ssh, I just do this to jog my memory:

$ grep -li "ssh" ~/bin/*
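You can see what those flags buy you with a quick experiment in a scratch directory: -l prints only the names of matching files (not the matching lines), and -i ignores case. The file names here are made up for the demonstration:

```shell
# -l: list matching file names only; -i: case-insensitive match
dir=$(mktemp -d)
echo 'ssh user@host "run backup"' > "$dir/backup-tool"
echo 'echo hello'                 > "$dir/hello-tool"
grep -li "SSH" "$dir"/*           # finds backup-tool despite the case difference
```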

Shell Tricks

Once you get your shell setup, it’s time to learn some of its most useful tricks.


The shell remembers everything you type. With a few simple characters, you can avoid retyping. The first of these shortcuts is !!, which repeats the last command entered. I'm ashamed to admit that I do this every time I edit my /etc/hosts file:

$ vi /etc/hosts

“Oh crap, it’s read only.”

$ sudo !!

If you just want to reuse the last item of the previous command, which is typically a file name, then you can use !$:

$ cat TODO.txt
$ rm !$

Merlin has Inbox Zero. I HAVE TODO LIST ZERO

The history command will give you a list of the last 500 things you’ve typed:

$ history

That’s a lot of stuff, huh?

If you want to execute one of those commands again, just use an exclamation point followed by the sequence number. For example:

$ !100

Because the history list is so long, there are a couple of strategies for dealing with its size. The first is to do a simple search:

$ history | grep "BLOW"

Alternatively, you can use less to search the list interactively:

$ history | less

There are a few other ways to reuse your shell history: do a search for “Event Designators” in the bash manual page.

Command Substitution

Another powerful feature of the shell is command substitution. When you place back ticks around a command, the results are used in the current command. For example:

$ xcode-select --print-path

$ cd `xcode-select --print-path`

$ pwd

Not only does this save typing, but it makes sure that you’re in the right place if you have multiple versions of Xcode installed. Bring on the beta releases!

Note: This is essentially how I set up the $XCODE environment variable in my .profile earlier.

Command substitution isn’t limited to just paths, either. Try this:

$ `which svn` --version
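A side note that may save you some escaping: the POSIX $( ) form is equivalent to back ticks and is much easier to nest. A quick sanity check:

```shell
# back ticks and $( ) both splice a command's output into the command line;
# the $( ) form nests without any escaping gymnastics
year=`date +%Y`
same_year=$(date +%Y)
echo "$year $same_year"
echo $(echo $(echo nested))   # try writing this with back ticks...
```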

Desktop Integration

As software developers, we live in two worlds: the command line and the desktop. Not surprisingly, Apple has provided several tools that help you unite these two worlds. The following are some of my favorites.


The first tool is called open. Both the name and a manual page that says “opens files and directories” belie its true power.

For example, say you want to open the shell’s current directory in a Finder folder. It’s this easy:

$ open .

If you want to reveal the enclosing folder for a directory or file, then just add the -R parameter:

$ open -R .

The default is to open a file item in the Finder, but you can also specify an application using the -a parameter. Here are a few examples:

$ open -a Terminal /Users/CHOCK/DOT_PRON

$ open -a Preview Default.png 

$ open -a TextEdit README.txt


Since this is a Unix tool, you can open the standard input in an application using the -f parameter:

$ cal | open -a TextEdit -f

The open command assumes that text will be used for standard input, so if you want to use it with other file types, you’ll need to use a temporary file. For example:

$ temp="/tmp/CHOCKIFY.HTML"; 
    curl -s <url> | 
    tr "[a-z]" "[A-Z]" > $temp; 
    open -a Safari $temp
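If you want to play with that pattern without a network connection, here's a self-contained variant. The file name is arbitrary, and cat stands in for the macOS-only open at the end:

```shell
# same shape as the pipeline above, fed from a local string so it runs anywhere:
# transform text, park it in a temp file, then hand the file to another program
temp="/tmp/shout.$$"
echo "hello from the shell" | tr "[a-z]" "[A-Z]" > "$temp"
cat "$temp"    # on a Mac: open -a Safari "$temp"
```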

And since URLs are just another kind of file in OS X, you can use open to download them:

$ open <url>

Or open your favorite network share points:

$ open afp://<server>/<share>

You can also include a "username:password@" before the host name, but I recommend that you just let the Finder prompt you for the login information. Remember that your shell history contains the last 500 things you've typed, so protect yourself from some smart-ass like me who sits down in front of your Terminal and does this:

$ history | grep -e "ssh" -e "afp"
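And if something sensitive has already landed in your history, you can scrub it. This sketch works on a scratch file so it's safe to run anywhere, but the same grep -v trick applies to your real $HISTFILE:

```shell
# scrub a secret from a (scratch) history file: keep everything except the
# offending lines, then swap the cleaned copy into place
histfile=$(mktemp)
printf '%s\n' 'ls -l' 'open afp://user:secret@host/share' 'pwd' > "$histfile"
grep -v "secret" "$histfile" > "$histfile.clean" && mv "$histfile.clean" "$histfile"
cat "$histfile"
```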

Drag and Drop

As you can see, there are many different ways to go from the command line to the desktop. Conversely, going from the desktop to the command line is remarkably simple: you can drag any file or folder from a Finder window into the Terminal window and you get its path.



As developers, we live and die by our clipboard. Code and data move between different contexts all day long thanks to Cocoa's NSPasteboard. It should not be surprising that pbcopy and pbpaste are simple and powerful integration points at the command line.

Want to see the contents of your clipboard or put it in a file? Here you go:

$ echo `pbpaste`

$ pbpaste > /tmp/ELMERS.TXT

Going the other direction, you can copy the current month’s calendar to the clipboard and then paste it into another app:

$ cal | pbcopy
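pbcopy and pbpaste only exist on OS X. If you want to follow along on another system, a file-backed stand-in captures the idea (on Linux, xclip or xsel are the real equivalents; the /tmp path here is an arbitrary choice):

```shell
# define file-backed pbcopy/pbpaste only when the real tools are missing
if ! command -v pbcopy >/dev/null 2>&1; then
  pbcopy()  { cat > "/tmp/pasteboard.${USER:-me}"; }
  pbpaste() { cat "/tmp/pasteboard.${USER:-me}"; }
fi
date +%Y | pbcopy    # "copy" the current year
pbpaste              # and "paste" it back
```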



Most apps have preferences that are managed by NSUserDefaults. You can easily view or modify these settings from the command line using the defaults command.

Let’s start by looking at all the preferences for Xcode:

$ defaults read com.apple.dt.Xcode

Looks more like a database than preferences, right? Every setting is shown with its key and associated value.

If you’re interested in a single key, you can use it to limit the output. Say you want to get the information of every iOS device you’ve used in the Xcode Device Manager:

$ defaults read com.apple.dt.Xcode DVTSavediPhoneDevices

You can also specify a new value after the key. Set the one true tab width using:

$ defaults write com.apple.dt.Xcode DVTTextIndentTabWidth 4

Sometimes settings are used to persist data across launches. There are often cases where you want to get rid of things without wiping out the entire database. Here’s an example of removing the recent text completions in Xcode:

$ defaults delete com.apple.dt.Xcode <key>

Note that using the defaults command is much safer than editing a .plist file in ~/Library/Preferences by hand. Beginning in Mavericks, there is a cfprefsd daemon that manages and caches updates to these files. If you use a text editor to modify the file directly, both you and your app will get confused when the changes don’t propagate through the cache managed by the daemon.


You know your Mac has pretty great speech synthesis built-in. But did you know it's available from the command line as well? Say hello to:

$ say hello

Do you ever find yourself with a process that takes a really long time to run? Something like realigning the cores on Marco’s new Mac Pro:

$ ./realign_core_processing_units ; say "cores realigned"

When the script finishes running after a few hours, you’ll hear “cores realigned”. Even if you’re not looking at the Terminal window, you’ll immediately know why everything feels so much snappier.
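The same trick works on systems without say. This sketch of mine falls back to a terminal bell plus a printed message, so it runs anywhere:

```shell
# speak when say exists (a Mac); otherwise ring the terminal bell and print
notify() {
  if command -v say >/dev/null 2>&1; then
    say "$1"
  else
    printf '\a%s\n' "$1"
  fi
}

sleep 1 && notify "cores realigned" || notify "realignment failed"
```

You can also split success from failure, as above: `./long_job && notify "done" || notify "failed"`.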

Now imagine the fun you can have when you ssh into a designer’s Mac with an open Skype microphone nearby:

$ say "I'm getting a tingling sensation in my hard drive."

Hugs and kisses, Louie.


Speaking of designers, one of the best ways to communicate with them is through pictures. The screencapture tool lets you do some things you can't do using the Command-Shift-3 and Command-Shift-4 keys in the Finder.

If you need to take a screenshot of the entire screen and want to put it in a new email message, just do this:

$ screencapture -C -M /tmp/image.png

Sometimes you need to get things set up before taking the screenshot (opening menus, for example.) So just tell screencapture to wait ten seconds:

$ screencapture -T 10 -P /tmp/image.png

The -P option tells the tool to open the captured image in the Preview app, too. That’s often helpful to make sure you got everything you wanted in the shot.

If you’re going to paste the image into an image editor, use the -c option to put the shot on the clipboard:

$ screencapture -c

If you're interested in getting just a portion of the screen, use the -s option to select it with the mouse. You can also specify different output formats with the -t option:

$ screencapture -s -t pdf /tmp/image.pdf

As you’ve seen, this tool has a lot of options, so I usually refresh my memory with the built-in help:

$ screencapture -h

mdls and mdfind

Spotlight search on the Desktop has become an essential tool for developers. We find code, documentation, messages and all kinds of information that’s related to our projects using Command-space and a simple text field. Would it surprise you to know that you can do more complex searches of the same dataset using the command line?

To give you an idea of how much searchable data is available for Spotlight, use the following command on one of your Mac's hard drives:

$ sudo du -sh /Volumes/ELEVEN/.Spotlight-V100
1.6G	/Volumes/ELEVEN/.Spotlight-V100

The Spotlight index on my MacBook Air’s 256 GB drive is a whopping 1.6 GB.

So how do we dig around in this huge database? The first step is to get an idea of what the key/value pairs in the index look like. The mdls tool is the way to do this:

$ mdls ~/Documents

Try this command on a few document files as well (folders and files have different metadata.) The keys usually start with “kMD”. For example, an item’s name in the filesystem is stored with the kMDItemFSName key. The values are everything after the equal sign.

Now that you’ve got an idea of what keys and values can be used, let’s do some searching!

At its simplest, mdfind is a command line version of the Spotlight search in the upper-right corner of your menubar:

$ mdfind -interpret "BOOGIE WOOGIE"


This facility starts to get more powerful when you start using keys and values in the query. For example, this is a simple query that lists all items, both files and folders, that have a custom icon:

$ mdfind "kMDItemFSHasCustomIcon == 1"

(Remember this command when we start digging around in resource forks below.)

These queries can be combined with || or && to form more expressive searches. Here's how you'd find all Photoshop files that have a layer named "Layer 1":

$ mdfind "kMDItemLayerNames == 'Layer 1' && 
     kMDItemContentType == 'com.adobe.photoshop-image'"

I found out that PSD files have a kMDItemLayerNames key by using mdls on a sample file. This is generally faster than Reading The Fine Manual.

The search through the metadata can also be limited to items that are in a specific folder. For example, here’s how you’d find all the items in a Project folder that have been localized in Japanese:

$ mdfind "kMDItemLanguages == 'Japanese'" 
    -onlyin ~/Projects

Another great feature of mdfind is that you can pipe its output to xargs to run other shell commands on the results. For example, the following command will find all apps that are PowerPC-only and send them to xargs (note the usage of -0 in both commands to delimit the listed items.) In turn, xargs will use du to show how much space each app uses and a grand total at the end:

$ mdfind "kMDItemExecutableArchitectures == 'ppc' && 
     kMDItemExecutableArchitectures != i386 && 
     kMDItemExecutableArchitectures != x86_64" -0 
    | xargs -0 du -sh -c

If you're like me and have been migrating your Applications folder for years, there are a surprisingly large number of items that are wasting space. Doing the actual work of removing these items is left as an exercise for the reader.

(Hint: rm is a shell command just like du. Be careful and backup first.)
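The -0 convention isn't unique to mdfind: find speaks the same NUL-delimited protocol with xargs, which makes it easy to see why the delimiting matters for names with spaces. A runnable sketch in a scratch directory:

```shell
# -print0/-0 pass names separated by NUL bytes, so "Old App.app" survives
# the trip to xargs as one argument instead of being split at the space
dir=$(mktemp -d)
printf 'stale' > "$dir/Old App.app"
printf 'keep'  > "$dir/notes.txt"
find "$dir" -name '*.app' -print0 | xargs -0 du -sh
```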

Application Integration

It’s incredibly handy to control your desktop apps using the shell. Since AppleScript has always been the best way to control apps, it makes sense that there would be a command line tool. The osascript tool is one the Swiss Army would love.

Want to change the volume of your speakers with a shell script? Go for it:

$ osascript -e 'set volume output muted true'
$ sudo osascript -e 'set volume 10'

You can also tell the Finder to do common chores:

$ osascript -e 'tell application "Finder" to empty trash'

It’s the end of a long work day and you have a script that needs to run for a few hours. It would be really nice to sleep your Mac after that job is finished, wouldn’t it?

$ ./realign_core_processing_units; 
    osascript -e 'tell application "Finder" to sleep'

It’s kind of a shame that Marco won’t be there to see how snappy the realigned cores are, but hey, whatever.

AppleScript from your shell can also relieve frustration:

$ osascript -e 'tell application "Messages" to quit'

Note that this is a more friendly way to do it than killall Messages since the "quit" Apple Event gives the app a chance to shut down gracefully. Not that Messages would really notice.

If you want to switch from the Terminal to another application, use this:

$ osascript -e 'tell app "Safari" to activate'

You can also use AppleScript to get information about apps:

$ osascript -e 'id of app "Xcode"'

$ defaults read `osascript -e 'id of app "Xcode"'`

Or get the properties of any file or folder:

$ osascript -s s -e 'tell application "Finder" ¬
    to get properties of item POSIX file ¬
    "/tmp/HOTT.png"' > /tmp/HOTTNESS.txt

$ cat /tmp/HOTTNESS.txt 
{class:document file, name:"HOTT.png", index:2, displayed name:"HOTT.png", name extension:"png", extension hidden:false, container:folder "tmp" of item "private" of startup disk of application "Finder", disk:startup disk of application "Finder", position:{-1, -1}, desktop position:missing value, bounds:{-33, -33, 31, 31}, kind:"Portable Network Graphics image", label index:0, locked:false, description:missing value, comment:"", size:49547, physical size:53248, creation date:date "Saturday, July 5, 2014 12:01:52 PM", modification date:date "Saturday, July 5, 2014 12:01:52 PM", icon:missing value, URL:"file://localhost/private/tmp/HOTT.png", owner:"CHOCK", group:"root", owner privileges:read write, group privileges:read only, everyones privileges:read only, file type:missing value, creator type:missing value, stationery:false, product version:"", version:""}

Note the use of ¬ as AppleScript’s line continuation character. You can get this character into your editor by using Option-L on your keyboard.

File Tools

We’re developers. We love files. In more ways than ls, cat, and rm can ever know.

Let’s meet some new friends…


How often have you opened a Get Info window in the Finder just to know the dimensions of an image or other basic information about a file in a project? The Finder is fine, but you’re already at the command line, so just use file instead:

$ file Default.png
Default.png: PNG image data, 640 x 1136, 8-bit/color RGB, non-interlaced

Getting the file information from the command line is often faster if you’re dealing with a lot of different resources. For example, when your designer gives you a bunch of PNG files and you need to set sizes for them in code, here’s how you get all the non-2x dimensions:

$ cd Images
$ file *[^2][^x].png
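The glob is doing the real work there: *[^2][^x].png rejects any name ending in "2x.png", which knocks out the @2x retina variants. You can convince yourself in a scratch directory:

```shell
# [^2][^x] means "second-to-last character isn't 2, last one before .png isn't x"
dir=$(mktemp -d); cd "$dir"
touch icon.png icon@2x.png banner.png banner@2x.png
ls *[^2][^x].png    # only the non-retina names survive
```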

Much quicker! And file can tell you things that the Finder can’t. Like the architectures supported in a binary:

$ cd /Applications/
$ file iTunes
iTunes: Mach-O universal binary with 2 architectures
iTunes (for architecture i386): Mach-O executable i386
iTunes (for architecture x86_64): Mach-O 64-bit executable x86_64

This is particularly helpful when you’re getting link errors for a .dylib library that’s in your project: file will probably tell you that an architecture you need is missing.

You can throw pretty much anything at file:

$ file Notes.rtf 
Notes.rtf: Rich Text Format data, version 1, ANSI

$ file Wallpaper.psd 
Wallpaper.psd: Adobe Photoshop Image, 744 x 1392, RGB, 3x 8-bit channels

$ file Rubylith.qtz 
Rubylith.qtz: Apple binary property list

$ file Colors.pdf 
Colors.pdf: PDF document, version 1.1

$ file xScope.entitlements 
xScope.entitlements: XML  document text

Note that it doesn’t know that .entitlements is an XML plist. But we’ve still got less to figure that out.

It’s amazing how many different types of files can be examined using this simple command. To get an idea:

$ ls /usr/share/file/magic

$ less /usr/share/file/magic/apple

If you ever run across a Newton package or some Applesoft BASIC, file has you covered.

Quick Look

As its name implies, Quick Look is an extremely fast way to check the contents of a project asset. Want to make sure your iOS app’s launch images are correct without leaving the command line?

$ qlmanage -p Default*.png

The arrow keys let you scroll through the images and a tap on the space bar gets you back at a command prompt. This is a much quicker and easier way than viewing the images with Preview or an open command—your hands never need to leave the keyboard!

To make this even easier, I have a short shell script named ql:

qlmanage -p "$@" 

I use it every day to check one or more files:

$ ql single.png multiple/*.{png,jpg}

Extended Attributes

While Spotlight’s index is an external source of metadata about files, the OS X file systems also support extended attributes. Like Spotlight’s metadata, these can be thought of as key/value pairs that are attached to any regular file or directory.

To view these attributes, you use the xattr utility:

$ xattr ~/Downloads/<file>

Another way to get the same information is to use the -l@ option with ls.

The file that was downloaded has three extended attributes. But what’s in these attributes?

The simple answer is that you don’t know: the use of reverse domain names is the first hint that the data is application-specific. Apple clearly uses these attributes for their own internal needs. Additionally, any application can attach its own information to a file using this mechanism.

In practice though, you can usually figure out what's being stored there using the -p and -lp options. For example, after you look at the contents of the attribute, it's a pretty safe bet that com.apple.quarantine is used to mark files that have just been downloaded:

$ xattr -p com.apple.quarantine ~/Downloads/<file>

In fact, if you want to skip the warnings from the Finder about the dangers of opening files, you can remove the quarantine attribute:

$ xattr -d com.apple.quarantine ~/Downloads/<file>

Since any data can be attached to the named attribute, sometimes you’ll see a bunch of hex values instead of text:

$ xattr -p <attribute> <file>
62 70 6C 69 73 74 30 30 A1 01 33 41 B6 86 3B 6D
B2 87 5D 08 0A 00 00 00 00 00 00 01 01 00 00 00
00 00 00 00 02 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 13

If that happens, just use the -lp option instead:

$ xattr -lp <attribute> <file>
00000000  62 70 6C 69 … B6 86 3B 6D  |bplist00..3A..;m
00000010  B2 87 5D 08 … 01 00 00 00  |..].............
00000020  00 00 00 00 … 00 00 00 00  |................
00000030  00 00 00 00 …              |.....

That “bplist” looks like a hint to the contents of the data. More about that in just a bit.

Extended attributes are pervasive: you’ll find that they’re working quietly behind the scenes to enable a lot of functionality we take for granted. For example, have you ever wondered why Xcode build folders never get backed up by Time Machine? Here’s your answer:

$ cd ~/Projects/CHOCKINATOR/build
$ xattr .
com.apple.metadata:com_apple_backup_excludeItem

Resource Forks

If you’re a long-time Mac developer, you’ll remember the days of resource forks. They were kind of like the extended attributes shown above, but the attribute names were limited to four characters and the data being stored was only accessible on a Mac.

Some things never die: Mac files can still have a resource fork. These days, they’re mostly used to store custom icons that have been attached to a file or folder.

To access the resource fork of a file, just append “..namedfork/rsrc” to the path. For example:

$ ls -oh FERRET
-rw-r--r--@ 1 CHOCK     0B Apr 19  2010 FERRET
$ ls -oh FERRET/..namedfork/rsrc
-rw-r--r--  1 CHOCK   155K Apr 19  2010 FERRET/..namedfork/rsrc

The file has zero bytes of data, but the resource fork uses 155 KB. Let’s throw it at file and see what happens:

$ file FERRET/..namedfork/rsrc
FERRET/..namedfork/rsrc: MS Windows icon resource

Close. It’s actually a Mac ICNS resource. I’m pretty sure this is the first and only time I’ve seen file get it wrong.

If you’ve ever noticed a zero length file named “Icon?” in a folder that has a custom icon, it’s the same thing:

$ file Icon^M/..namedfork/rsrc
Icon/..namedfork/rsrc: MS Windows icon resource

Note: to get the ^M in the name, type Control-V followed by Control-M. You can do this to get any control character into the shell’s edit buffer.

Internally, the resource forks are stored as extended attributes using com.apple.ResourceFork. If you try to access that attribute with xattr, you'll get an "operation not permitted" error.

Finder Info

Another extended attribute that can contain useful information is com.apple.FinderInfo. This chunk of data contains, well, information for the Finder:

$ xattr HOTT.png
com.apple.FinderInfo
$ xattr -p com.apple.FinderInfo HOTT.png
00 00 00 00 43 48 4F 4B 00 50 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00

Unless you’re me, you don’t recognize the extreme importance of 0x43, 0x48, 0x4F, and 0x4B. Luckily, there are some helpful tools in your Xcode Developer Tools folder. The first is GetFileInfo:

$ GetFileInfo HOTT.png 
file: "/private/tmp/HOTT.png"
type: ""
creator: "CHOK"
attributes: avbstclinMEdz
created: 01/21/2014 10:31:53
modified: 01/21/2014 10:31:53

The man page for GetFileInfo explains the attributes. The creator, and source of those four hex bytes mentioned above, needs no explanation.

To change the file info, you’d use SetFile. That being said, it’s probably easier to do it in the Finder’s Get Info panel:

$ osascript -e 'tell application "Finder" to ¬
    open information window ¬
    of item POSIX file "/tmp/HOTT.png"'

Fun Fact: These tools have been around since the days of MPW. If you don’t know what MPW means, consider yourself lucky. In fact, pretty much everything in $XCODE/Tools is a trip back in time. Do this if you remember them as the “good ol’ days”:

$ man Rez

Examining Binaries

A lot of the files we deal with are executable. Even if symbols have been stripped from the app, you can still infer a lot of information by looking at the null terminated strings present in the data:

$ cd /Applications/
$ strings TextEdit
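If you're curious what strings does under the hood, it essentially keeps runs of printable characters and throws away everything else. A rough, portable approximation (the file contents here are fabricated for the demo):

```shell
# squash non-printable bytes to newlines, then keep runs of six or more
# printable characters -- roughly what strings does by default
dir=$(mktemp -d)
printf 'ab\000\001NSString\000xy\002_handleEvent\000' > "$dir/blob"
tr -c '[:print:]' '\n' < "$dir/blob" | grep -E '.{6,}'
```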

Better yet, take it a step further and look at the information the Objective-C runtime uses in the Mach-O binary:

$ class-dump TextEdit

I think it's safe to say that the iOS Jailbreak community would not have formed as quickly as it did without this tool. It's that important for figuring out what another developer did in their Objective-C code.

If you don’t have class-dump installed on your development machine, fix that right now. And while you’re there, don’t be a tightwad: put some money in Steve Nygard’s tip jar. He’s been supporting and maintaining this free tool for years.

Test and Debug

So far we’ve pretty much treated the shell as a static environment. But we all know there are lots of gears and levers moving behind the scenes. The shell is a great place to watch that activity.


When I’m in a shell and want to see the overall activity on a system, this is the first tool that I turn to:

$ top -d -u -s 10

The -d option shows deltas for network and paging activity (which is much more useful than total counts.) I typically run this command when some process is running amok: -u sorts the list by CPU usage. Using -s 10 updates the output every 10 seconds, minimizing the sampling effect on the results. A longer sampling period also helps keep the listed processes from jumping around too much.

On the desktop, I always run the first Cocoa app I ever wrote: iPulse presents this information (and more!) in an interface that’s easy to read at a glance.


Have you ever tried to eject a disk and had the Finder tell you it can't because there are still files open? You've probably seen lsof used as a way to tell you what those files are:

$ lsof +d /Volumes/KILO

But as with most great Unix tools, lsof is not a one-trick pony.

For example, you can use lsof to see all the files an app has open. Let’s look at everything the Finder has open using the -c option:

$ lsof -c Finder

Yeah, that’s a lot of files. So use grep to narrow down the search:

$ lsof -c Finder | grep PrivateFrameworks

Hmmm… the large ArtFile.bin file in CoreUI.framework sure looks interesting.

And remember that Unix sockets are just another kind of file. So in addition to checking what normal files a process has open:

$ lsof -c Dropbox | grep "/Users/CHOCK"

You can also check what kind of network connections it has open and the state of those connections:

$ lsof -c Dropbox | grep "TCP"
$ lsof -c Dropbox | grep "LISTEN"

Taking the network theme a bit further, the -i parameter lets you see which processes have open connections on specific ports:

$ lsof -i :80
$ lsof -i :22

Want to see all the processes that are listening on a socket? Here you go:

$ lsof -i | grep LISTEN

Or ones that have established connections:

$ lsof -i | grep ESTABLISHED

Remember that sockets are often used for interprocess communication, so just because there’s a connection, it doesn’t mean that packets are leaving your Mac.

In general, lsof is a great tool for investigating processes that are unknown, poorly documented, or you're just curious about:

$ psc securityd
   18     1 root           /usr/sbin/securityd -i
$ sudo lsof -p 18 | grep "/Applications"

You probably didn’t realize it at first, but you just got a list of all applications that are using the Keychain. Note that sudo is required because the securityd process runs with elevated privileges.

As you can see, there are a lot of options and parameters, so make sure to take a look at the lsof man page.


If you’re developing for Mac or iOS, you already know how damn useful Instruments is for tracking application behavior. DTrace is the framework that makes all that possible. Well, take a look at all the stuff in the shell that “uses DTrace”:

$ man -k dtrace

Whoa. That’s a lot of stuff. A shorter and more useful list can be had by just showing the “snoop” tools:

$ man -k snoop

As you saw above, lsof gives you a snapshot of the current state of processes and the files they have open. The “snoop” tools, on the other hand, show you the system as it changes state. For example, let’s use opensnoop to watch the system logging process as it opens files:

$ sudo opensnoop -p `cat /var/run/`

Every command that uses DTrace requires elevated permissions to retrieve the buffered data from the kernel. Get used to typing sudo.

Now, in another Terminal window, send something to the system log:

$ sudo syslog -s "CHOCK ME BABY"

The window running opensnoop should show something like this:

  UID    PID COMM          FD PATH                 
    0     23 syslogd       15 /var/log/system.log  

Have you ever wondered how crackers find the super secret file you use to store serial numbers? Use the -n option to specify a process by name:

$ sudo opensnoop -n <app name>

There you go. Good luck trying to outsmart the kernel with your “uncrackable” protection scheme.

If you’re interested in seeing which processes open a specific file, use -f to specify a path:

$ cd ~/Library/Keychains
$ sudo opensnoop -f CHOCK.keychain

You can also track file or disk activity with rwsnoop and iosnoop:

$ sudo rwsnoop -n Safari
$ sudo iosnoop -n Safari

As you saw with that long list of DTrace shell commands, there’s a lot more you can do with this facility. In fact, you can actually write code that gets compiled and run. Here’s a one-liner that shows when processes are spawned and exec’ed:

$ sudo dtrace -n 'proc:::exec-success { 
    printf("%d %s", timestamp, curpsinfo->pr_psargs); 
}'

In fact, those “snoop” tools you just ran are just script wrappers around code written in the D language:

$ less `which opensnoop`

For more information and tips about using DTrace on Mac OS X, check out the DTrace site.


Another tool to watch what's going on in the filesystem is fs_usage. This tool reports any activity for file system calls like fopen(), fwrite(), getattrlist(), fsync(), etc.

For example, if you want to watch the Finder manipulate files:

$ sudo fs_usage Finder

When using this tool, remember that a "file system" doesn't necessarily mean a disk is involved. The -f option lets you filter for network-related events:

$ sudo fs_usage -f network Safari


If there was an award for the crappiest OS X man page, I’d have to award it to tccutil:

$ man tccutil

Only one service is listed in the man page: “AddressBook”. There are others, and now that you know how to poke around in binary files, you can find them:

$ strings /System/Library/PrivateFrameworks/TCC.framework/TCC 
    | grep kTCCService 

If you're testing an app that requests a user's calendar information, you can reset the authorization by removing the "kTCCService" prefix from the list above:

$ sudo tccutil reset Calendar

Be careful when you're using this tool: you're resetting all apps in each service category, not just your own. I do not recommend running this command on a machine where you have apps already registered in System Preferences > Security & Privacy > Privacy. Use a test machine or a clean VM image before you start blasting away at security settings.

It’s also worth noting that Accessibility is a special case. There is a SQLite database that contains the bundle ids for the applications that have been granted access. And you can modify that database:

$ sudo sqlite3 
    "/Library/Application Support/com.apple.TCC/TCC.db"
sqlite> .schema access
sqlite> select * from access;
sqlite> delete from access where client like '%Xcode';

Note that these commands have no effect on whether you see the authorization prompts the first time a user accesses the service from your app. After a new app has been presented with the alert dialog, an entry is added to the defaults:

$ defaults read com.apple.LaunchServices

You can hunt for specific paths and bundle IDs, or just use the nuclear option:

$ defaults delete com.apple.LaunchServices

After clearing the warnings, you’ll need to restart your Mac and run the app again. Again, this is much more palatable if you’re working in a test environment on a dedicated machine or VM.

When you're working with the above commands, remember that as far as security is concerned, your app runs under Xcode's bundle ID while it's in the debugger. In many cases, you want to get rid of settings for com.apple.dt.Xcode as well as the ones for your own app.

The Internet

True story: the Internet existed on the command line before Tim Berners-Lee ever thought about making a browser in the early '90s. First there was Telnet in 1969, then FTP in 1971, followed by Finger in 1977, and so on.

I think it’s pretty safe to say the command line will never die, but the best part is that your Terminal continues to be a tool that works great in our networked world.


Every time I upload a new version of our software, I check it using curl:

$ curl --head <url>

This lets me see that a "200 OK" response is returned and that the number of bytes matches my local copy.

Another great use for curl is to watch all data that gets transferred over a HTTP connection. And I mean ALL the data:

$ curl -v http://CHOCKLOCK.COM

The title of curl‘s man page is “transfer a URL.” That all encompassing description should be your first hint that the tool does a lot. And I mean a lot—it supports the DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP protocols.

If you use URLs in your work, do yourself a favor and check out the man page.


Did you know that you can write code that runs in the OS X kernel and looks at every network packet at the data link layer? And that you can do it from the command line?

The tcpdump command is a user interface for Berkeley Packet Filters. This powerful tool can show you anything that’s happening on your network.

A simple example is to watch what’s happening when you access a web resource. Say you want to see all the HTTP data that passes on the network when you load Gruber’s latest post in your browser:

$ sudo tcpdump -s 0 -A 
    port 80 and host daringfireball.net

The -s parameter sets the amount of data you see in each packet: when it’s set to zero, you get everything. The -A parameter specifies that the data should be displayed in ASCII. You need to use sudo with tcpdump because you’re accessing the /dev/bpf* devices which have root-only permissions.

In conjunction with Internet Sharing, tcpdump can be a very powerful tool for iOS developers. After setting up Internet Sharing in your Mac's System Preferences, you can then watch traffic that passes between the Ethernet and Wi-Fi interfaces of that shared connection. If your iPhone is connected over Wi-Fi to your Mac using the IP address, the following command will show all the requests it sends to Daring Fireball:

$ sudo tcpdump -s 0 -A port 80 \
    and src \
    and dst 

If you were interested in packets going the other way, just swap the src and dst. Hopefully, you’ll see how this is useful when debugging network traffic between your iOS device and a REST service.

The syntax used to write the packet filter code is defined in the pcap-filter manual page (“pcap” is short for “packet capture”). The filtering options in the expression include detecting different network protocols and examining the length and contents of the packets themselves.


Have you ever had a folder full of files that you’ve wanted to access through a web browser? You could set up Apache to do this by editing the httpd.conf file, or just enter the following command in the folder you want to access:

$ python -m SimpleHTTPServer 8000

Now, when you open http://localhost:8000 in a browser, you’ll be able to traverse the directory structure. Note that the server runs until you use Control-C to get back to the shell prompt.

This can be a great tool when testing your code: the directory can contain static JSON files that include data for your app. Just update the endpoints in your code to use localhost and you’re set.
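A test script can generate the fixture, start the server, and hit the endpoint itself; a small sketch in Python (the file name and data are made up):

```python
import http.server
import json
import threading
import urllib.request

# A static JSON fixture for the app to load (contents are illustrative).
with open("users.json", "w") as f:
    json.dump([{"id": 1, "name": "Ada"}], f)

# Serve the fixture directory, just like `python -m SimpleHTTPServer`.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# The app's endpoint, pointed at localhost for testing.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/users.json") as resp:
    users = json.load(resp)

print(users[0]["name"])  # Ada
```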


Caches are designed to store information that’s used repeatedly. But when that data gets stale, you need a way to flush the cache. As developers, we’re often faced with DNS or other changes that affect our network stack.

When you find some piece of software misbehaving on the network, it’s often caused by the Directory Service cache being out-of-date. Luckily, there’s a simple command to flush the cache and get things back in working order:

$ dscacheutil -flushcache

If you’re having problems resolving IP addresses from domain names, you can restart the system’s DNS resolver with:

$ sudo killall -HUP mDNSResponder

A similar command causes statistics and other diagnostic information to be logged to /var/log/system.log:

$ sudo killall -INFO mDNSResponder

Be prepared to sift through a lot of information!


Every Unix system has a simple tab-delimited database of service names and ports. The file is located in /etc/services. If you’re wondering which service is assigned to port 69, do this:

$ grep "\s69/" /etc/services
tftp             69/udp     # Trivial File Transfer
tftp             69/tcp     # Trivial File Transfer

You’ll also see that the TFTP service can use both TCP and UDP protocols.

Another handy search is if you’re looking for the standard ports for a protocol. Say you want to know which ports can be used for IMAP:

$ grep imap /etc/services
imap      143/udp  # Internet Message Access Protocol
imap      143/tcp  # Internet Message Access Protocol
imap3     220/udp  # Interactive Mail Access Protocol v3
imap3     220/tcp  # Interactive Mail Access Protocol v3
imap4-ssl 585/udp  # IMAP4+SSL (use 993 instead)
imap4-ssl 585/tcp  # IMAP4+SSL (use 993 instead)
imaps     993/udp  # imap4 protocol over TLS/SSL
imaps     993/tcp  # imap4 protocol over TLS/SSL
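The same database is exposed programmatically: Python’s socket module wraps the C getservbyname/getservbyport calls, which consult /etc/services:

```python
import socket

# Name -> port: the programmatic version of grepping /etc/services.
imap_port = socket.getservbyname("imap", "tcp")

# Port -> name: the reverse lookup for the port-69 example above.
tftp_name = socket.getservbyport(69, "udp")

print(imap_port, tftp_name)  # typically: 143 tftp
```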

IP address

Need to know the WAN IP address you’ve been assigned outside your internal LAN?

$ curl -s

Loading the site in your browser shows a lot of other options. Be aware that information about the HTTP connection and HTML content aren’t available from the command line unless you explicitly set them as options for curl.


Data is never in the format you need it, is it? The shell’s notion of standard input and output has always made it great for doing data conversion. Here are some tools that you may not know about…


Want to convert text between .txt, .html, .rtf, .rtfd, .doc, .docx, .wordml, .odt and .webarchive formats? Look here:

$ man textutil

It’s pretty hard to read a binary property list, so convert it to XML or JSON. Or vice versa:

$ man plutil
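If you’d rather script the conversion, the Python standard library’s plistlib module reads and writes both forms; a minimal round trip (the keys here are just illustrative):

```python
import plistlib

data = {"CFBundleName": "MyApp", "CFBundleVersion": "1.0"}  # illustrative keys

# Write the binary form that plutil would otherwise convert for you.
with open("Info.plist", "wb") as f:
    plistlib.dump(data, f, fmt=plistlib.FMT_BINARY)

# Read it back, then re-serialize as human-readable XML.
with open("Info.plist", "rb") as f:
    round_tripped = plistlib.load(f)
xml_bytes = plistlib.dumps(round_tripped, fmt=plistlib.FMT_XML)

print(round_tripped == data, xml_bytes[:5])  # True b'<?xml'
```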

Your designer gives you individual PNG files for your app icon, but your project needs an .icns file. Or vice versa:

$ man iconutil

Pretend like you’re in The Matrix:

$ hexdump -C Default.png

Or convert between hex and binary formats:

$ man xxd
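If neither tool is handy, a rough equivalent of `hexdump -C` is only a few lines of Python; this sketch prints the offset, hex bytes and an ASCII column:

```python
def hexdump(data, width=16):
    """Format bytes roughly like `hexdump -C`: offset, hex bytes, ASCII column."""
    lines = []
    for offset in range(0, len(data), width):
        chunk = data[offset:offset + width]
        hexpart = " ".join(f"{b:02x}" for b in chunk)
        asciipart = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{offset:08x}  {hexpart:<{width * 3}}|{asciipart}|")
    return "\n".join(lines)

# The first eight bytes of any PNG file.
print(hexdump(b"\x89PNG\r\n\x1a\n"))
```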


How many ways do we need to compress data? In the world of the command line, I’m pretty sure the answer to that question is a number that approaches infinity. Regardless, here are a few of the most common formats and the tools used to view and extract the data from the command line.

If you have a ZIP file and you want to quickly view its contents, use the unzip command with the -l option:

$ unzip -l feather.zip

Without the option, the ZIP file will be extracted to the current directory:

$ unzip feather.zip
Similarly, a GZIP compressed TAR archive can be listed with:

$ tar tvfz feather.tgz

And extracted with:

$ tar xvfz feather.tgz

Fun fact: the name tar is short for “tape archive”. That gives you an idea of how long this piece of code, and the resulting archives, have been around. I think it’s pretty cool that an archive you created in the 1970s can still be used forty years later. How much code being written today will be able to say that?
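The format is also scriptable without shelling out: Python’s tarfile module reads and writes the same archives. A minimal sketch mirroring the commands above (file names are illustrative):

```python
import tarfile

# Something to archive (name and contents are made up).
with open("notes.txt", "w") as f:
    f.write("hello\n")

# Create a gzip-compressed archive, like `tar cvfz feather.tgz notes.txt`.
with tarfile.open("feather.tgz", "w:gz") as tar:
    tar.add("notes.txt")

# List its members, like `tar tvfz feather.tgz`.
with tarfile.open("feather.tgz", "r:gz") as tar:
    names = tar.getnames()

print(names)  # ['notes.txt']
```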

Log files are often compressed with either GZIP or BZIP. A quick way to uncompress these files and view their contents is to use the “cat” variants and pipe the output to less:

$ gzcat les_paul.log.gz | less

$ bzcat /var/log/system.log.0.bz2 | less
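The same streaming read works from a script: Python’s gzip and bz2 modules are the stdlib counterparts of gzcat and bzcat. A small sketch with a made-up log file:

```python
import gzip

# A sample compressed log (stands in for a rotated system log).
with gzip.open("sample.log.gz", "wt") as f:
    f.write("line one\nline two\n")

# Stream it back decompressed, like `gzcat sample.log.gz | less`.
with gzip.open("sample.log.gz", "rt") as f:
    lines = f.read().splitlines()

# bz2.open offers the identical interface for .bz2 files.
print(lines)  # ['line one', 'line two']
```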

Disk Utilities

If you’re looking for a disk utility, look no further than diskutil. This is essentially the command line version of the Disk Utility app in your Applications/Utilities folder.

The first command I use is list:

$ diskutil list
 #:                  TYPE              SIZE     IDENTIFIER
 0: GUID_partition_scheme             *251.0 GB disk0
 1:                   EFI              209.7 MB disk0s1
 2:             Apple_HFS KITTYS       250.1 GB disk0s2
 3:            Apple_Boot Recovery HD  650.0 MB disk0s3

This gives you the layout of all attached disks. The identifier column is typically used as a parameter for the other commands. For example, if you want to see all the information about your Apple_HFS partition, you’d use the identifier for that partition:

$ diskutil info disk0s2

A similar command gets information about the disk itself. In this case, we’re also using grep to quickly get the SMART status for the drive:

$ diskutil info disk0 | grep "SMART"
   SMART Status:             Verified

Some of the commands can use volume names instead of identifiers. For example, to eject a volume, you can use:

$ diskutil eject /Volumes/NOODLE

Be careful with diskutil since it also provides commands like secureErase and partitionDisk—they can quickly and permanently destroy your data. Expert users also use this command to set up and manage software RAID sets.

Another simple disk utility is df (short for “disk free”). To show how much free space you have on your boot volume, use:

$ df -h /


Did you know you can use Unicode in shell commands? Yep:

$ echo $'\xf0\x9f\x8d\x94'
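Those four bytes (f0 9f 8d 94) are the UTF-8 encoding of a single code point, U+1F354; decoding them in Python makes that explicit:

```python
# The same four bytes the shell example emits, decoded as UTF-8.
char = b"\xf0\x9f\x8d\x94".decode("utf-8")

print(char, hex(ord(char)))  # 🍔 0x1f354
```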


Batch Convert Text Files to Different Formats in Terminal

Converting text between formats is a bit of a pain to do on a file-by-file basis, but thankfully MacIssues shows how to convert text files in bulk using a simple terminal command.

The tool you’ll use here is textutil, which can convert a text file from one format to another. Converting a single file is pretty straightforward. For example, here’s how you’d convert a file to DOCX:

textutil -convert docx ~/Desktop/mypage.webarchive

You can do this in bulk by simply pointing to a specific folder. For example, if you wanted to convert a group of text files in a folder on your desktop from TXT to DOCX you’d type this into Terminal:

textutil -convert docx ~/Desktop/TextDocuments/*.txt

Of course, that’s just one of the more basic things textutil can do. You can also use it to bulk change fonts, font sizes and more.

Convert various text file formats in the OS X Terminal with textutil [MacIssues]


Together we can make Scotland better.

Ahead of an expected visit from Ed Miliband to Scotland today (Friday), Scotland’s Health Secretary Alex Neil challenged the Labour Leader to come clean on the huge impact that NHS privatisation in England will have on Scotland.

Labour south of the border are warning in the strongest possible terms about the destruction of the NHS in England.

Mr Miliband’s Shadow Health Secretary has said that there are ‘no limits on the extent of privatisation in the NHS’, and that it will ‘eventually destroy everything that is precious about it.’

Importantly, Labour in England also echo the trade unions’ concern on charges being introduced for treatments, stating in their document ‘The Choice’ that “Under [The Tories] there is the prospect of more NHS services being charged for, and fewer services being provided free at the point of need.” In other words – private money replacing public funding.

Privatisation of the NHS south of the border has consequences for Scotland, as the NHS budget in England has a direct effect on Scotland’s devolved budget.

While the SNP has protected Scotland’s NHS in government, it is estimated that 50 per cent of the NHS in England will be run by private companies by 2020.

Every £10 cut from NHS England through austerity, privatisation, or patient charges reduces our budget by £1.

Commenting, Mr Neil said:

“Mr Miliband knows perfectly well how Scotland is funded under devolution – now he must come clean on the huge impact that NHS privatisation in England will have on this budget.

“It was of course Labour who opened the floodgates on privatisation when they were last in office.

“Now Ed Miliband and his colleagues are going round telling everyone that privatisation will eventually destroy everything that is precious about the NHS.

“Does he seriously expect the people of Scotland to believe that the objective of this privatisation is to increase public spending?

“Cuts to our NHS budget as a result of Westminster privatisation is a risk Scotland doesn’t need to take.

“We have protected the NHS budget in real terms, but only a Yes vote gives us the opportunity to use Scotland’s immense wealth to ensure our public services are properly funded in the long term.”



Andy Burnham (17th September 2009) “With quality at its core, I think the NHS can finally move beyond the polarising debates of the last decade over private or public sector provision.”

Labour Shadow Health Secretary Andy Burnham (Radio 4, 29th July 2014):

“…this government sees no limits on the extent of privatisation in the NHS.”

“…if we allow the continued advance of the market into the NHS it will eventually destroy everything that’s precious about it.”

Dr Philippa Whitford, a consultant breast surgeon, said earlier this year that “In five years, England will not have an NHS and in 10 years, if we vote no, neither will we.” (WfI Video, Speech, 6 May 2014)




At the beginning of the month, after seventeen years in local government, the last eight spent in Oxfordshire, I handed in my pass, laptop and phone and left the fourth floor for the last time. It’s probably been the most difficult work decision of my life. I loved my job (managing a brilliant quality and contracts team for social care services). I loved the many colleagues I had in and out of the Council and the people with learning disabilities and their families I worked with. I loved being part of a Social Services Department with values and decency and vision. But, in recent years the work has become increasingly difficult. The reason – perhaps known only to insiders, but gradually making it onto the media recently – is the horrific level of funding cuts imposed on local government since 2010.

It is no coincidence that the last time I left a local authority, in 1991, I felt the same weight of pressure of trying to deliver the impossible task of providing good quality social care at a time of vicious cuts. I thought then that I would never return. But in 1997, in the optimistic days of the first New Labour administration, I couldn’t resist the job of joint commissioner of learning disability services in Camden, making combined health and social care a reality. It was an exciting time to work in local government, as positive initiative after positive initiative flowed from Downing Street.

“Valuing People” put the idea of person centred services at the heart of learning disabilities services. “SureStart”, piloted by a colleague of mine, demonstrated the value of preventative services for families of children suffering from deprivation. The Mental Capacity Act gave us a framework for giving people with limited capacity the opportunity to maximise control of their important decisions.

For me, the crowning glory was the concept of Self-Directed Support, which has become national policy, thanks to the fine work of my good friend and colleague Dr Simon Duffy, at the NGO In Control and subsequently at the Centre for Welfare Reform – with which Ekklesia has considerable fellow-feeling and opportunity to collaborate.

It wasn’t perfect. In the last seventeen years I have witnessed more than my fair share of things going wrong, safeguarding incidents and complaints. I’ve seen people’s services break down, and professionals like myself struggle to fix it. I’ve seen service users and carers get a raw deal and organisations ignore their rights. But I have also seen remarkable growth. Service user and carer groups are now an established part of the landscape. Their right to speak has been integrated into council decision making processes, through Partnership Boards, consultations, and regular feedback groups.

I have also seen the closure of all the learning disability hospitals (aside from assessment and treatment services) and the rise of supported living that enables people with learning disabilities to be tenants within their own communities. I’ve watched as people with learning disabilities have become more visible in drama, sport and politics. And I’ve been proud to be part of this revolution. I thought, naively, that it would go on forever. I was wrong.

The warning signs were there, back in 2008, at the time of the bank crash. As Labour panicked and decided to bolster the bankers uncritically, Shadow Chancellor George Osborne took to the media to decry the wasteful public sector (conveniently ignoring the deregulation of the financial system that played a major role in the crash). It was a clever move and led to the big pre- and post-election lie – that the previous government broke the country, and only austerity could fix it.

Labour was not immune to such rhetoric either. In the run up to the last election, the local government settlement was nowhere near as generous as it had been. Year on year, we were being asked to do a little bit more with a little bit less. And we did. In Oxfordshire, we worked hard with our provider organisations to identify better, cheaper ways of doing things that didn’t impact on quality. We merged our back office functions, closed down buildings and, to my mind, made our working conditions a lot worse through the introduction of hot-desking and open plan offices in a way that lacked real thought or understanding. It was hard. Sometimes service users, carers and providers disagreed with our assessment of how to make the money spread further. Personally, I found sharing a floor with a hundred people immensely stressful. But there was a sense of common values, common vision, and an understanding we had to make every penny count, if we wanted to ensure the best services for the people of Oxfordshire.

Then, in 2010, just before the election, I attended a briefing for managers from our finance team. The message was stark. Bad times were coming. A 25 per cent cut was predicted if Labour came back to power and a staggering 28 per cent with a Conservative government. This was far worse than anything ever contemplated at the height of Thatcherism. We had anticipated some of this with the savings plans we had in place, but we were going to have to make some unpalatable decisions – and it wasn’t going to get any easier.

When the UK Coalition Government was first formed, there was a sense of calm across the country. The Conservatives were very clever in comparing the national economy to household finances; something that Ann Pettifor and other political economists have shown to be a highly deceptive analogy. This paved the way for the idea that, individual or country, if you are in deficit you have to make cuts in your outgoings (irrespective of the impact on overall debt, productivity, investment and so on, when it comes to the real economy).

The Liberal Democrats were only too quick to join this consensus. The soothing message that “we’re all in this together” was reassuring. People were fed up with Labour and everyone wanted a change. But, for those of us working in the public sector, we realised that Osborne’s first budget heralded disaster. It was like being on the top of a cliff and seeing a tsunami coming, but being too far away from the people on the beach to shout a warning.

The last four years have been as brutal as we feared. In all my thirty years in social care I have never seen anything like it. Provider organisations have restructured, found ways to involve non-paid carers, merged, re-graded salaries and embraced zero hours contracts to cut their hourly rates (some back to pre-1997 levels). In the Council, we’ve restructured, downgraded some posts, increased the span of others. Everyone I worked with is doing a more complex job, with more responsibility, for less pay. The result? Stressed staff, stretched resources, an increasing risk of things going wrong, and the worry of the backlash that might follow. Moreover, there’s worse to come. The second wave of a further 10 per cent of cuts will hit Oxfordshire over the next three years, and there’s a third and possible fourth on their way – though at least, this time round, the people on the beach have a lot more awareness, thanks to the work of determined and focused campaigns such as The Hardest Hit, the War on Welfare (WoW) Petition group and the National Health Action Party.

It has been a tough time. I have survived thus far because Oxfordshire has not been quite as badly hit as other local authorities. It has also taken an ethical approach to reduced funding, aiming to preserve as much social care as it can. Together, my colleagues and I have worked hard to deliver on that, and I am proud that we’ve managed to do that in an inclusive way, rather than imposing top down solutions on people. But I came to the point where I realised I could not keep doing this and stay healthy and sane. Every day I can see austerity impacting on the whole of society and making things worse.

Welfare, education and the NHS are all being hit by the same cuts, driven, not by a desire to improve the national debt (which is now significantly worse than in 2010) but by a poisonous ideology that private profit must come before common good. So I decided that the time had come to move somewhere where I can be in a position to challenge that ideology. I have joined Ekklesia as the new Chief Operating Officer, because I can see what an important role a think-tank linking good quality research and ideas to civic action, political change and a positive role for churches and other NGOs can play in influencing mainstream politics. That is what I want to be part of.

I hope that this move will help me be part of the growing resistance to flawed austerity models and the generation of viable, humane alternatives. Government has a duty to promote the common good. I am desperately sad to have left my friends and colleagues behind, but it is my hope that in moving to Ekklesia, I can be part of a larger, creative conversation that will help change the direction we are taking in Britain and beyond. Hopefully that will mean a better tomorrow for my wonderful colleagues in local government and for the people who rely on their essential services.


Virginia Moffatt is Chief Operating Officer of Ekklesia. She has a strong background in organisational management and has worked in social care for many years. She is an active advocate for peace, justice and inclusive welfare, and is a published writer.


Holy See Reminds UN That Family Is Good for Individuals

The family “continually exhibits a vigour much greater than that of the many forces that have tried to eliminate it as a relic of the past, or an obstacle to the emancipation of the individual, or to the creation of a freer, egalitarian and happy society”

June 25, 2014


Here is the address given by Archbishop Silvano Tomasi, permanent observer of the Holy See to the United Nations and Other International Organizations in Geneva, at the 26th Session of the Human Rights Council on Item No. 8- General Debate in Geneva on Tuesday, 24 June 2014.

* * *

Mr. President,
My Delegation supports the importance given by the United Nations to the twentieth anniversary observance of the International Year of the Family. This significant event was recently highlighted in a special way, on 15 May 2014, during the International Day of Families, under the theme: “Families Matter for the Achievement of Development Goals”.  Surely, the choice of this theme had a strong relationship to Resolution 2012/10, adopted by ECOSOC that stressed the need “for undertaking concerted actions to strengthen family-centred policies and programs as part of an integrated, comprehensive approach to development”; and that invited States, civil society organizations and academic institutions “to continue providing information on their activities in support of the objectives of and preparations for the twentieth anniversary.”

This Council is well aware, Mr. President, of the strong debates held in this very chamber about the nature and definition of the family. Such discussions often lead States to conclude that the family is more of a problem than a resource to society. Even the United Nations materials prepared for the observance of this Anniversary Year stated: “Owing to rapid socio-economic and demographic transformations, families find it more and more difficult to fulfil their numerous responsibilities.”(1) My Delegation believes that despite past or even current challenges, the family, in fact, is the fundamental unit of human society. It continually exhibits a vigour much greater than that of the many forces that have tried to eliminate it as a relic of the past, or an obstacle to the emancipation of the individual, or to the creation of a freer, egalitarian and happy society.

The family and society, which are mutually linked by vital and organic bonds, have complementary functions in the defence and advancement of the good of every person and of humanity.(2) The dignity and rights of the individual are not diminished by the attention given to the family. On the contrary, most people find unique protection, nurture, and dynamic energy from their membership in a strong and healthy family founded upon marriage between a man and a woman. Moreover, ample evidence has demonstrated that the best interest of the child is assured in a harmonious family environment in which the education and formation of children develop within the context of lived experience with both male and female parental role models.

The family is the fundamental cell of society where the generations meet, love, educate, and support each other, and pass on the gift of life, “where we learn to live with others despite our differences and to belong to one another.”(3) This understanding of the family has been embraced throughout history by all cultures. Thus, with good reason the Universal Declaration of Human Rights recognized unique, profound, and uncompromising rights and duties for the family founded on marriage between a man and a woman, by declaring as follows: “(1) Men and women of full age, without any limitation due to race, nationality or religion, have the right to marry and to found a family.

They are entitled to equal rights as to marriage, during marriage and at its dissolution. (2) Marriage shall be entered into only with the free and full consent of the intending spouses. (3) The family is the natural and fundamental group unit of society and is entitled to protection by society and the State.”

Mr. President, during this historic anniversary observance, the Holy See Delegation firmly maintains that the family is a whole and integral unit, which should not be divided or marginalized. The family and marriage need to be defended and promoted not only by the State but also by the whole of society. Both require the decisive commitment of every person because it is starting from the family and marriage that a complete answer can be given to the challenges of the present and the risks of the future.(4) The way forward is indicated in the fundamental human rights and related conventions that ensure the universality of these rights and whose binding value need to be preserved and promoted by the International Community.


2. Pontifical Council for the Family, “Charter on the Rights of the Family,” 1983.

3. Pope Francis, Evangelii Gaudium, 24 November 2013, #66.

4. Pontifical Council for the Family, “The Family and Human Rights,” 2000.


Crowds turn out to object to Darlington housing plans

RESIDENTS’ CONCERN: A consultation meeting about development at Bellburn Lane field. PICTURE: Stuart Boulton.

CROWDS of people turned out to have their say on plans to build more than 6,000 homes in Darlington.

Residents potentially affected by Darlington Borough Council’s Making and Growing Places plan turned out in their droves to a consultation event held at St Mary’s Community Centre in Cockerton today (Wednesday, June 18).

The event was staged to garner opinions on a revised draft policy prepared following an initial consultation period held last summer.

Many residents voiced their objections to the plans, which could see homes built on green spaces throughout the town.

Representatives from The Friends of Cocker Beck rallied against proposals to build close to the conservation area.

Jan Needham said: “We’ve just got planning permission to build a pond and now they’re going to dump buildings 100 metres from it.

“We’ve been doing great work down there and it’s just obscene.”

Others were there to register their dismay at plans to build on a well-used field close to Bellburn Lane.

Bev Swainston said: “Every time I’ve been out on that field it’s being used. It’s the only green area around and we don’t want it building on.”

Hazel Neasham, head of housing for the council, said: “The objections are what you’d expect from a community used to a piece of green space and concerned that it’s going to disappear.

“Fifty years ago, Branksome was farm land and people were probably saying don’t build there.

“Any town that wants to grow needs to grow properties.

“If we want to attract investment we need places for people to live – we’ve also got an ageing population and young people wanting to live independently.

“Any new housing will cause concern but we need to improve – this development will allow us to do that.”

Another consultation event will be held at St Andrew’s Church Hall in Haughton le Skerne tomorrow from 5pm.


Legalized Marijuana Makes Drugged-Driving Study High Priority


Whether you think the grass is always greener on the side where weed is allowed or fear the whole country will go to pot, legalized recreational marijuana has taken root in the U.S. As it spreads to more states, questions about what a legalized America will look like abound, many of which will only be answered through trial, error and experience. One such issue — how to handle the projected increase in “drugged driving” — is about as hazy as a ’90s college dorm room littered with black-light posters and Phish CDs.


Two states already have legalized the recreational use of marijuana — Colorado and Washington — and more than a dozen states are poised to follow suit by 2016. That’s in addition to 22 states and the District of Columbia, where the drug already has been OK’d for medical purposes or decriminalized.

According to anti-drunken-driving advocacy group Mothers Against Drunk Driving, alcohol-related driving deaths have declined by half since 1980, but drunken-driving accidents cost the U.S. about $132 billion a year. What’s worse, on average 1 in 3 people will be involved in an alcohol-related crash in their lifetime and every day 28 people die in drunken-driving accidents. How will marijuana-related driving accidents impact these numbers?


It’s difficult to quantify the direct role marijuana plays in car crashes, and apples-to-apples comparisons to alcohol aren’t available. As one analysis notes, “many accidents are caused by people using marijuana in conjunction with other drugs, or in combination with alcohol.” Still, a National Highway Traffic Safety Administration study found that 4 percent of drivers were high during the day and more than 6 percent at night, and that nighttime figure more than doubled on weekends. Moreover, Columbia University researchers performing a toxicology examination of nearly 24,000 driving fatalities concluded that marijuana contributed to 12 percent of traffic deaths in 2010, tripled from a decade earlier.

NHTSA studies have found drugged driving to be particularly prevalent among younger motorists. One in eight high school seniors responding to a 2010 survey admitted to driving after smoking marijuana. Nearly a quarter of drivers killed in drug-related car crashes were younger than 25. Likewise, nearly half of fatally injured drivers who tested positive for marijuana were younger than 25.

But people are and have been smoking marijuana, regardless of its legal status, so will incentives such as increased ease of access, decreased risk of trouble with the law and a destigmatizing of the drug’s users translate to more people stoned behind the wheel? No one knows that for sure, but Colorado has seen a spike in driving fatalities in which marijuana alone was involved. The trend started in 2009 — the year medical marijuana dispensaries were effectively legalized at the state level, starting the so-called Green Rush there — and remained stable through 2013, the year after recreational prohibition ended in Colorado.

To combat what it calls “the growing problem of drugged driving,” the federal Office of National Drug Control Policy’s National Drug Control Strategy includes a goal of reducing drugged driving by 10 percent by making prevention a “national priority on par with preventing drunk driving.” The strategy calls for:

  • Encouraging states to adopt “per se” laws prohibiting any trace of the drug in a driver’s system while in control of a vehicle, even absent other evidence of impairment
  • Collecting further data on drugged driving
  • Public education
  • Law-enforcement training for identifying drugged drivers
  • Standardized screening methods for drug-testing labs for use in detecting the presence of drugs

Complicating matters of crafting and enforcing drugged-driving laws is limited study of the effects of smoking marijuana specifically on operating a motor vehicle. Soon that won’t be the case. NHTSA and the National Institute on Drug Abuse are now in the final months of a three-year, half-million-dollar cooperative study to determine the impact of inhaled marijuana on driving performance. Tests observe participants who ingest a low dose of THC (the active drug in marijuana), a high dose and a placebo to assess the effects on performance, decision-making, motor control, risk-taking behavior and divided-attention tasks.

The study is being performed using what NHTSA calls “the world’s most advanced driving simulator,” the University of Iowa’s National Advanced Driving Simulator, which was previously used to study the effects of alcohol on driving. It’s the first time NADS is being used to study the effects of an illicit substance, though researchers hope it will help clear the air on the marijuana issue.

“The mixed results from previous cannabis-dosed driving studies have demonstrated that its effects on driving can be more difficult to detect than the effects of alcohol,” NHTSA stated. “The NADS, a more sensitive data collection tool, is capable of detecting the more subtle changes in driving behavior of cannabis-dosed participants.”

From a law-enforcement perspective, aside from garden-variety physical signs that a motorist is intoxicated, it’s difficult for an officer to determine on the spot whether someone has specifically used marijuana, and no reliable Breathalyzer-style test for the drug yet exists. Still, the law doesn’t distinguish between drunk and drugged, and substance-impaired drivers can still be charged with driving under the influence.

“A DUI is a DUI,” the report notes, “and toxic to a driving record and car insurance rates either way.”

Photo illustration by Paul Dolan; photos by Dario Lo Presti/iStock/Thinkstock, Chad Baker/Jason Reed/Ryan McVay/Photodisc/Thinkstock, defun/iStock/Thinkstock and itayuri/iStock/Thinkstock


Briefly: New Vue 2014.5 adds Exchange Area, DevonTechnologies' updates

E-on Software releases VUE 2014.5 and its new Exchange Area for 3D files

E-on Software has made two additional announcements this week: the release of Vue 2014.5 xStream and Infinite, and the opening of Cornucopia3D’s new File Exchange Area. Vue 2014.5 xStream and Infinite are the latest production-driven versions for the creation, animation and rendering of natural 3D environments, designed for graphics professionals.

Improved support for Plant Factory vegetation has been added, speeding up the rendering of plants by up to 20 times. Realistic rendering can be achieved with improved photometric lights, with new presets added such as Carbon Arc, Sodium Vapor, Metal Halide, and more. Vue xStream 2014.5 now supports Autodesk 2015 3D products (3ds Max 2015, Maya 2015 and Softimage 2015), and also adds support for V-Ray 3. The update is free of charge for all registered users of Vue 2014 xStream and Infinite, and for new users pricing starts at $2,000.

The Exchange Area was created to help artists collectively improve by exchanging tips and ideas, as well as share content freely. Now that it is officially launched, Plant Factory users can upload content in its native format and share it for free with other software users. Sharing does not require a vendor account; however, users who prefer to sell their content must open an account with Cornucopia 3D. Once an item is uploaded, it becomes immediately available to the entire Exchange Area community.

DevonThink 2.7.6 and DevonSphere Express 1.7 updated, improved workflow

DevonTechnologies has announced updates to its productivity software, DevonThink and DevonSphere Express, adding workflow improvements to each. DevonThink v2.7.6 allows users to create customized tables of contents when exporting documents as a website. DevonThink (starting at $50) creates standard “index.html” files during the export, and web pages can be clipped as HTML, web archive, PDF or Markdown.

DevonSphere Express ($10) provides related data when users work with documents. Version 1.7 adds support for OS X Mavericks Finder tags and comments, and indexes the Messages archive as well. A new category allows indexing of folders, which is faster, smaller, and includes a right-click menu icon that provides access to frequently-used commands. All editions of DevonThink and DevonSphere Express require OS X 10.6.8 or later.

by MacNN Staff




HS2 chair calls for better national rail plan

Britain needs to rethink its plans for the existing rail network if HS2 is to deliver maximum benefit for the country’s northern cities, the project’s new chairman said yesterday.

Speaking at the launch of a report recommending the government bring the second, northern phase of the proposed high-speed rail network forward by three years, Sir David Higgins said the scheme had to be better integrated with efforts to improve east-west connections in the North of England.

His report – originally commissioned to suggest cost savings – also recommended building a new regional transport hub at Crewe that would bring high-speed services to the North six years earlier than originally planned.

Though the report, entitled HS2 Plus, didn’t include a reduction in the total estimated budget of the project, Higgins did recommend shelving the planned £700m link with the Channel Tunnel rail link (HS1) in favour of examining other proposals, and reviving plans for a complete redevelopment of Euston station in London using private investment.

Higgins said government, rail authorities and the business community must come together to produce a more integrated transport plan that will maximise the benefits of HS2 by making it the spine of a modern rail system.

‘High Speed 2 has the potential to transform the North, not just individual cities but the region as a whole. But this will only be the case if we can see the bigger picture. So far the attention has been on individual places.

‘We need to think broader than that, properly coordinating HS2 with not just the existing network but also plans for its improvement during the time in which HS2 will be built. That would create the real possibility of improving journey times not just north-south, but also east-west.’

Integrated plan

High Speed 2 will link London, Birmingham, Manchester and Leeds in a Y-shaped network with trains travelling at up to 225mph but also transferring to the existing network to travel further north at conventional speeds.

Higgins told The Engineer that this would radically solve the issue of overly long journeys between London, the Midlands and the north of England. ‘But there’s still a gap, which is east-west,’ he said. ‘Money will be spent at the same time as we build High Speed 2. It’s not a case of either-or. It’s about the same amount of money to be spent on the existing network, but where is it going to be spent?’

This would include ‘an ambitious plan’ for better connecting Manchester and Leeds but should also include locations not directly served by HS2, he added. ‘You look at cities like Bradford, Wakefield, Barnsley or Stoke. You have to be able to show how those cities will benefit from this investment by either connecting into the new line or upgrading the existing ones.’

Higgins’ specific proposals included building a new rail and road interchange station south of Crewe, rather than tunnelling under the city to link to the existing station. The new hub and the line south to Birmingham would still form part of the second phase of construction but could then be opened three years before the whole scheme is complete.

Because Higgins also believes Phase Two could be completed by 2030 rather than 2033, the Crewe station would bring some of the benefits of faster journey times to the North six years earlier than originally planned.

‘You can’t build the phases at the same time because they require huge documents that take two to three years to prepare and two to three years to get through parliament,’ he said. ‘But you can look at what other things you can do in the North that don’t require a bill.’

He gave the example of the planned Northern Hub scheme to alleviate rail bottlenecks through Manchester and speed up east-west services, which he said could be brought forward.

Upgrading Euston

In London, Higgins proposed redesigning the current plans for Euston to include a “level deck design” that would allow access from one side of the station to another and enable more over-site development of shops, housing and offices.

‘Euston will need upgrading in 10 years anyway: let’s do it properly,’ said Higgins. ‘We could not just restore the grandeur of the Euston arch, but build something that rivals [recently redeveloped London stations] King’s Cross and St Pancras.’

He also called the proposed link with HS1 services at St Pancras ‘an imperfect compromise’ due to cost, a lack of evidence for demand for direct services from northern England to Paris, and the impact on existing passenger services, freight and the local community. He recommended the government commission a study of how else the two lines might be joined, for example by a new tunnel or by improving passenger transit facilities.

But, he added, the money saved from the HS1 link, which has been seen as a way of connecting northern cities directly with the continent, should not be funnelled into the Euston redevelopment. ‘Euston must stand on its own two feet,’ he said.

On the overall cost, Higgins said uncertainty over the project meant it would be irresponsible to cut the contingency budget for Phase One, and that it was too early to make a judgement on Phase Two, but accelerating Phase Two would open the door to more cost savings. ‘The more certainty there is about the timescale the more certainty there will be about controlling costs,’ he said.

Government welcome

The transport secretary, Patrick McLoughlin, said the government supported the ambition to bring the benefits of HS2 to the North more quickly and that he would now commission HS2 and Network Rail to produce more detailed plans based on Higgins’ proposals so they could be considered as part of the public consultation on Phase Two.

The Institution of Civil Engineers (ICE) also welcomed the proposals. ‘We see no fundamental engineering reason why the line could not be operational earlier than 2033 and experience around the world also indicates this is possible,’ said ICE director general, Nick Baveystock.

‘Government’s efforts to make the case for HS2 must continue and importantly, it should position the project as an integral part of a national transport strategy rather than a project developed in isolation. This includes further work to help strengthen connectivity for those locations not directly served by HS2.’

But Joe Rukin, campaign manager for the group Stop HS2, questioned whether the money for sufficient east-west connections would be delivered if HS2 went ahead. ‘If you had a national infrastructure plan you wouldn’t build HS2,’ he told The Engineer. ‘That’s the problem: it’s always been looked at in isolation.’



Drowned In Money

We all know what’s gone wrong, or we think we do: not enough spending on flood defences. It’s true that the government’s cuts have exposed thousands of homes to greater risk, and that the cuts will become more dangerous as climate change kicks in(1). But too little public spending is a small part of the problem. It is dwarfed by another factor, which has been overlooked in discussions in the media and statements by the government: too much public spending.

Vast amounts of public money—running into the billions—are spent every year on policies that make devastating floods inevitable. This is the story that has not been told by the papers or the broadcasters, a story of such destructive perversity that the Guardian has given me twice the usual space today in which to explain it.

Flood defence, or so we are told almost everywhere, is about how much concrete you can pour. It’s about not building houses in stupid places on the floodplain, and about using clever new engineering techniques to defend those already there(2). None of that is untrue, but it’s a small part of the story. To listen to the dismal debates of the past fortnight you could be forgiven for believing that rivers arise in the plains; that there is no such thing as upstream; that mountains, hills, catchments and watersheds are irrelevant to the question of whether or not homes and infrastructure get drowned.

The story begins with a group of visionary farmers at Pontbren, in the headwaters of Britain’s longest river, the Severn. In the 1990s they realised that the usual hill farming strategy—loading the land with more and bigger sheep, grubbing up the trees and hedges, digging more drains—wasn’t working. It made no economic sense, the animals had nowhere to shelter, the farmers were breaking their backs to wreck their own land.

So they devised something beautiful. They began planting shelter belts of trees along the contours. They stopped draining the wettest ground and built ponds to catch the water instead. They cut and chipped some of the wood they grew to make bedding for their animals, which meant that they no longer spent a fortune buying straw. Then they used the composted bedding, in a perfect closed loop, to cultivate more trees(3).

One day a government consultant was walking over their fields during a rainstorm. He noticed something that fascinated him: the water flashing off the land suddenly disappeared when it reached the belts of trees the farmers had planted. This prompted a major research programme, which produced the following astonishing results: water sinks into the soil under the trees at 67 times the rate at which it sinks into the soil under the grass(4). The roots of the trees provide channels down which the water flows, deep into the ground. The soil there becomes a sponge, a reservoir which sucks up water then releases it slowly. In the pastures, by contrast, the small sharp hooves of the sheep puddle the ground, making it almost impermeable: a hard pan off which the rain gushes.

One of the research papers estimates that, even though only 5% of the Pontbren land has been reforested, if all the farmers in the catchment did the same thing, flooding peaks downstream would be reduced by some 29%(5). Full reforestation would reduce the peaks by around 50%(6). For the residents of Shrewsbury, Gloucester and the other towns ravaged by endless Severn floods, that means, more or less, problem solved.

Did I say the results were astonishing? Well, not to anyone who has studied hydrology elsewhere. For decades the British government has been funding scientists working in the tropics, and using their findings to advise other countries to protect the forests or to replant trees in the hills, to prevent communities downstream from being swept away. But we forgot to bring the lesson home.

So will the rest of the Severn catchment, and those of the other unruly waterways of Britain, follow the Pontbren model? The authorities say they would love to do it(7). In theory. Natural Resources Wales told me that these techniques “are hard wired in to the actions we want land managers to undertake.”(8) What it forgot to say is that all tree planting grants in Wales have now been stopped. The offices responsible for administering them are in the process of closing down(9). If other farmers want to copy the Pontbren model, not only must they pay for the trees themselves; but they must sacrifice the money they would otherwise have been paid for farming that land.

For—and here we start to approach the nub of the problem—there is an unbreakable rule laid down by the Common Agricultural Policy. If you want to receive your single farm payment—by far the biggest component of farm subsidies—that land has to be free from what it calls “unwanted vegetation”(10). Land covered by trees is not eligible. The subsidy rules have enforced the mass clearance of vegetation from the hills.

Just as the tree planting grants have stopped, the land clearing grants have risen. In his speech to the Oxford Farming Conference, made during the height of the floods, the environment secretary Owen Paterson boasted that hill farmers “on the least-productive land” will now receive “the same direct payment rate on their upland farmland as their lowland counterparts.”(11) In other words, even in places where farming makes no sense because the land is so poor, farmers will now be paid more money to keep animals there. But to receive this money, they must first remove the trees and scrub that absorb the water falling on the hills.

And that’s just the start of it. One result of the latest round of subsidy negotiations—concluded in June last year—is that governments can now raise the special mountain payments, whose purpose is to encourage farming at the top of the watersheds, from €250 per hectare to €450(12). This money should be renamed the flooding subsidy: it pays for the wreckage of homes, the evacuation of entire settlements, the drowning of people who don’t get away in time, all over Europe. Pig-headed idiocy doesn’t begin to describe it.

The problem is not confined to livestock in the mountains. In the foothills and lowlands, the misuse of heavy machinery, overstocking with animals and other forms of bad management can—by compacting the soil—increase the rates of instant run-off from 2% of all the rain that falls on the land to 60%(13).

Sometimes, ploughing a hillside in the wrong way at the wrong time of the year can cause a flood—of both mud and water—even without exceptional rainfall. This practice has blighted homes around the South Downs (that arguably should never have been ploughed at all). One house was flooded 31 times in the winter of 2000-2001 by muddy floods caused by ploughing(14). Another, in Suffolk, above which the fields had been churned up by pigs, was hit 50 times(15). But a paper on floods of this kind found that “there are no (or only very few) control measures taken yet in the UK.”(16)

Under the worst environment secretary this country has ever suffered, there seems little chance that much of this will change. In November, in response to calls to reforest the hills, Owen Paterson told parliament “I am absolutely clear that we have a real role to play in helping hill farmers to keep the hills looking as they do.”(17) (Bare, in other words). When asked by a parliamentary committee to discuss how the resilience of river catchments could be improved, the only thing he could think of was building more reservoirs(18).

But while he is cavalier and ignorant when it comes to managing land to reduce the likelihood of flooding, he goes out of his way to sow chaos when it comes to managing rivers.

Many years ago, river managers believed that the best way to prevent floods was to straighten, canalise and dredge rivers along much of their length, to enhance their capacity for carrying water. They soon discovered that this was not just wrong but counterproductive. A river can, at any moment, carry very little of the water that falls on its catchment: the great majority must be stored in the soils and on the floodplains.

By building ever higher banks around the rivers, by reducing their length through taking out the bends and by scooping out the snags and obstructions along the way, engineers unintentionally did two things. They increased the rate of flow, meaning that flood waters poured down the rivers and into the nearest towns much faster. And, by separating the rivers from the rural land through which they passed, they greatly decreased the area of functional floodplains(19,20,21).

The result, as authorities all over the world now recognise, was catastrophic. In many countries, chastened engineers are now putting snags back into the rivers, reconnecting them to uninhabited land that they can safely flood and allowing them to braid and twist and form oxbow lakes. These features catch the sediment and the tree trunks and rocks which otherwise pile up on urban bridges, and take much of the energy and speed out of the river. Rivers, as I was told by the people who had just rewilded one in the Lake District—greatly reducing the likelihood that it would cause floods downstream—“need something to chew on”(22,23).

There are one or two other such projects in the UK: Paterson’s department is funding four rewilding schemes, to which it has allocated a grand total of, er, £1 million(24). Otherwise, the secretary of state is doing everything he can to prevent these lessons from being applied. Last year he was reported to have told a conference that “the purpose of waterways is to get rid of water”(25). In another speech he lambasted the previous government for a “blind adherence to Rousseauism” in refusing to dredge(26). Not only will there be more public dredging, he insists: but there will also be private dredging: landowners can now do it themselves(27).

After he announced this policy, the Environment Agency, which is his department’s statutory adviser, warned that dredging could “speed up flow and potentially increase the risk of flooding downstream.”(28) Elsewhere, his officials have pointed out that “protecting large areas of agricultural land in the floodplain tends to increase flood risk for downstream communities.”(29) The Pitt Review, commissioned by the previous government after the horrible 2007 floods, concluded that “dredging can make the river banks prone to erosion, and hence stimulate a further build-up of silt, exacerbating rather than improving problems with water capacity.”(30) Paterson has been told repeatedly that it makes more sense to pay farmers to store water in their fields, rather than shoving it off their land and into the towns.

But he has ignored all this advice and started seven pilot projects in which farmers will be permitted to drag all that messy wildlife habitat out of their rivers, to hurry the water down to the nearest urban pinchpoint(31). Perhaps we shouldn’t be surprised to discover that Paterson has demanded massive cuts at the Environment Agency, including many of the staff responsible for preventing floods(32).

Since 2007, there have been a review, a parliamentary enquiry, two bills and new flood management programmes(33), but next to nothing has changed. Floods, because of the way we manage our land and rivers, remain inevitable. We pay a fortune in farm subsidies and river-mangling projects to have our towns flooded and homes and lives wrecked. We pay again in the form of the flood defences necessitated by these crazy policies, and through the extra insurance payments—perhaps we should call them the Paterson tax—levied on all homes. But we also pay through the loss of everything else that watersheds give us: beauty, tranquillity, wildlife and, oh yes, the small matter of water in the taps.

In The Compleat Angler, published in 1653, Izaak Walton wrote this: “I think the best Trout-anglers be in Derbyshire; for the waters there are clear to an extremity.”(34) No longer. Last summer I spent a weekend walking along the River Dove and its tributaries, where Walton used to fish. All along the river, including the stretch on which the fishing hut built for him by Charles Cotton still stands, the water was a murky blueish brown. The beds of clean gravel he celebrated were smothered in silt: on some bends the accretions of mud were several feet deep.

You had only to raise your eyes to see the problem: the badly-ploughed hills of the mid-catchment and above them the drained and burnt moors of the Peak District National Park, comprehensively trashed by grouse shooting estates. A recent report by Animal Aid found that grouse estates in England, though they serve only the super-rich, receive some £37m of public money every year in the form of subsidies(35). Much of this money is used to cut and burn them, which is likely to be a major cause of flooding(36). Though there had been plenty of rain throughout the winter and early spring, the river was already low and sluggish.

A combination of several disastrous forms of upland management has been helping Walton’s beloved river to flood, with the result that both government and local people have had to invest heavily in the Lower Dove flood defence scheme(37). But this wreckage has also caused it to dry up when the rain doesn’t fall.

That’s the flipside of a philosophy which believes that land exists only to support landowners, and waterways exist only “to get rid of water”. Instead of a steady flow sustained around the year by trees in the hills, by sensitive farming methods, by rivers which are allowed to find their own course and their own level, to filter and hold back their waters through bends and braiding and obstructions, we get a cycle of flood and drought. We get filthy water and empty aquifers and huge insurance premiums and ruined carpets. And all of it at public expense.

First published in the Guardian.




3. Coed Cadw and Coed Cymru, no date given. The Pontbren Project: a farmer-led approach to sustainable land management in the uplands.

4. M. R. Marshall et al, 2013. The impact of rural land management changes on soil hydraulic properties and runoff processes: results from experimental plots in upland UK. Hydrological Processes, DOI: 10.1002/hyp.9826.

5. Howard Wheater et al, 2008. Impacts of upland land management on flood risk: multi-scale modelling methodology and results from the Pontbren experiment. FRMRC Research Report UR 16.

6. As above.

7. See for example Natural England, Environment Agency, Defra, Welsh Government et al, 2012. Greater working with natural processes in flood and coastal erosion risk management.

8. NRW, 9th January 2014, by email.

9. I talked to one of the employees over the weekend: everyone is being made redundant as all funding has ceased.

10. Official Journal of the European Union, 31st January 2009. Council Regulation (EC) No 73/2009 of 19 January 2009, establishing common rules for direct support schemes for farmers under the common agricultural policy and establishing certain support schemes for farmers, amending Regulations (EC) No 1290/2005, (EC) No 247/2006, (EC) No 378/2007 and repealing Regulation (EC) No 1782/2003. Annex III.

This rule remains unchanged in the current round.


12. European Commission, 26th June 2013. CAP Reform—an explanation of the main elements.

13. Natural England, Environment Agency, Defra, Welsh Government et al, 2012. Greater working with natural processes in flood and coastal erosion risk management.

14. John Boardman and Karel Vandaele, 2010. Soil erosion, muddy floods and the need for institutional memory. Area (2010) 42.4, 502–513. doi: 10.1111/j.1475-4762.2010.00948.x

15. R. Evans, 2010. Runoff and soil erosion in arable Britain: changes in perception and policy since 1945. Environmental Science and Policy 13, pp. 141–149. doi: 10.1016/j.envsci.2010.01.001

16. John Boardman and Karel Vandaele, 2010. Soil erosion, muddy floods and the need for institutional memory. Area (2010) 42.4, 502–513. doi: 10.1111/j.1475-4762.2010.00948.x


18. Owen Paterson, 2013. In evidence to the Environment, Food and Rural Affairs Committee. Managing Flood Risk, Volume I.

19. I am grateful to Dr Richard Hey and to Charles Rangely-Wilson for the discussions we had about these issues.

20. Natural England, Environment Agency, Defra, Welsh Government et al, 2012. Greater working with natural processes in flood and coastal erosion risk management.

21. Sir Michael Pitt, 2008. Learning lessons from the 2007 floods. The Pitt Review.

22. See

23. I hope before long to write up the extraordinary story I was told by a representative of United Utilities about the sharply differing responses of the rewilded River Liza in Ennerdale and the still-canalised St John’s Beck in Thirlmere to the famous 2009 downpour.

24. Natural England, Environment Agency, Defra, Welsh Government et al, 2012. Greater working with natural processes in flood and coastal erosion risk management.




28. Judy England and Lydia Burgess-Gamble, August 2013. Evidence: impacts of dredging.

29. Environment Agency, 2009. River Severn Catchment Flood Management Plan: Summary Report.

30. Sir Michael Pitt, 2008. Learning lessons from the 2007 floods. The Pitt Review.



33. Defra and the Environment Agency, 2011. Understanding the risks, empowering communities, building resilience: the national flood and coastal erosion risk management strategy for England.

34. Chapter XVII.

35. Animal Aid, 2013. Calling the Shots: the power and privilege of the grouse-shooting elite.

36. See also the Upper Calder Valley Ban the Burn campaign.



NCAE lawsuit challenges elimination of teacher tenure


The North Carolina Association of Educators is continuing its legal challenges to laws adopted by the General Assembly this year.

A week after filing a lawsuit over the state’s new voucher program to help low-income students attend private schools, the teachers organization and six individual teachers on Tuesday filed a lawsuit over a provision in the state budget that eliminated the “career status” protections afforded to veteran teachers.

“Career status repeal is part of a full frontal assault by the legislative majority on public education in North Carolina,” NCAE President Rodney Ellis said in a statement. “It’s part of a full frontal assault on the teachers, the children, the families and the future of our state. No wonder teachers are leaving our state in droves.”

Under career status, commonly referred to as tenure, teachers were given extra due process rights, including the right to a hearing if they were disciplined or fired.

Bill Harrison, a former chairman of the State Board of Education, said legislative leaders are relying on a common misconception about tenure to build support for the move.

“The notion that career status makes it impossible to get rid of under-performing teachers is a myth,” Harrison said in a statement. “We need to be concerned about keeping the excellent teachers we have.”

Senate President Pro Tem Phil Berger, who initially crafted the tenure elimination proposal, and House Speaker Thom Tillis quickly labeled the lawsuit “frivolous,” noting that teachers will now be rewarded for job performance instead of having a tenure system that “fosters mediocrity and discourages excellence.”

“By filing another frivolous lawsuit, the union has made its blueprint clear: ‘If at first you don’t succeed at the polls, then sue, sue again,’” Berger and Tillis said in a statement. “While union leaders are focused on succeeding in the courtroom, we’ll remain focused on our children succeeding in the classroom.”

Several Triangle-area teachers are among the plaintiffs in the lawsuit: Brian Link, who teaches at East Chapel Hill High School, Rich Nixon, who teaches in Johnston County, and Rhonda Holmes, who teaches in Northampton County.

Link, who worked as a lawyer in New York before getting into teaching, said he chose to work in North Carolina instead of Florida because of this state’s career protections for teachers.

“I believed this state respected and valued its teachers,” he said in a statement. “Now, three years into my career, I will have none of those basic employment rights that first made me want to come here.”

Nixon, who has taught history for 34 years, said the state is backing out of a key protection upon which teachers rely.

“Teaching is an honorable profession. North Carolina should do the honorable thing – live up to its promise and uphold the contract it made with me in 1978,” he said.

NCAE attorney Ann McColl said “fundamental constitutional principles are being compromised” in the elimination of career status. She noted that neither it nor the voucher proposal were able to pass the General Assembly as standalone bills, so they were tucked into the budget, which was hammered out behind closed doors with no public input and little debate.


Oh, the Humanity: Is the Threat of Overpopulation Still a Big Deal?

PEOPLE PLANET: A crowded street in Kathmandu, Nepal. Image: Pavel Novak

E – The Environmental Magazine

Dear EarthTalk: Is it true that human overpopulation isn’t such a big issue anymore, as numbers are expected to start declining in a few decades?—Melinda Mason, Boone, Iowa

Ever since Thomas Malthus published “An Essay on the Principle of Population” in 1798, positing incorrectly that humans’ proclivity for procreation would exhaust the global food supply within a matter of decades, population growth has been a hot-button issue among those contemplating humankind’s future. Indeed, our very success going forth and multiplying, paired with our ability to extend our life expectancy, has meant that we are perpetually pushing the limits of the resource base that supports us.

When Malthus was worrying about the planet’s “carrying capacity,” there were only about a billion of us on the planet. Today our population tops seven billion. While better health care and medicine along with advances in food production and access to freshwater and sanitation have allowed us to feed ourselves and stave off many health ills, some so-called Neo-Malthusians believe we may still be heading for some kind of population crash, perhaps triggered or exacerbated by environmental factors related to climate change.

But others are less concerned given projections that world population will likely start to decline once the world’s less developed nations urbanize and start lowering their birth rates, as has already happened in Europe, the U.S., Australia and parts of Asia. For example, Europe’s “fertility rate” between 2005 and 2010 was just 1.53 live births per woman (the standard replacement rate to maintain a stable population is 2.1). Without immigration, Europe’s population would already be shrinking.

Of course, the immigration that continues to fuel population numbers in developed countries is coming from somewhere. Indeed, population numbers are still growing in many of the world’s developing countries, including the world’s most populous nation, China, and its close rival, India. Fertility rates in Africa also remain among the highest in the world, and many countries there are growing fast, too. Poverty and health problems due to poor sanitation, lack of access to food and water, the low social status of women and other ills continue to cripple these regions. Overpopulation could plague us indefinitely if fertility rates don’t drop in these areas, especially as they ramp up their Western-style development.

Globally, the United Nations estimates that the number of humans populating the planet in 2100 will range from as few as 6.2 billion—almost a billion less than today—to as many as 15.8 billion on the high end. Meanwhile, other researchers confirm the likelihood of world population levels flattening out and starting to decline by 2100 according to the lower UN estimate. To wit, the Austria-based International Institute for Applied Systems Analysis (IIASA) recently unveiled research showing that if the world stabilizes at a fertility rate comparable to that of many European nations today (roughly 1.5), the global human population will be only half of what it is today by the year 2200, and only one-seventh by 2300.

It is difficult to say which way the global population pendulum will swing in centuries to come, given ever-changing cultural, economic and political attitudes and the development demographics they affect. As such the jury is still out as to whether human overpopulation will become a footnote in history or the dominant ill that stands in the way of all other efforts to achieve sustainability and a kinder, gentler world.

CONTACTS: Thomas Malthus; United Nations; IIASA
