September 18, 2014

Priorities & Perseverance

Robbie Williamson


This is not a stock ticker, rather a health ticker…and unlike with a stock price, a downward trend is good.  Over the last 3 years or so, I’ve been on a personal mission to improve my health.  As you can see, it wasn’t perfect, but I managed to lose a good amount of weight.

So why did I do it…what was the motivation?  It’s easy: I decided in 2011 that I needed to put me first.  This was me from 2009.


At my biggest, I was pushing 270lbs.  I was so busy trying to do for others, be it work, family, or friends, that I was constantly putting my needs last, i.e. exercise and healthy eating.  You see, I actually like to exercise, and healthy eating isn’t a hard thing for me, but when you start putting those things last on your priority list, it becomes easy to justify skipping the exercise or grabbing junk food because you’re short on time or exhausted from being the “hero”.

Now I have battled weight issues most of my life.  Given how I looked as a baby, this shouldn’t come as a surprise. LOL


But I did thin out as a child.


To only get bigger again


And even bigger again


But then I got lucky.  My metabolism kicked into high gear around 20, I grew about 5 inches, and since I was playing a ton of basketball daily, I could eat anything I wanted and still stay skinny.


I remained so up until I had my first child; then the pounds began to come on.  Many parents will tell you that the first time is always more than you expected, so it’s not surprising that, with sleep deprivation and stress, you gain weight.  To make it even more fun, I had decided to start a new job and buy a new house a few years later, when my second child came…even more “fun”.


To be clear, I’m not blaming any of my weight gain on these events; however, they became easy crutches to justify putting myself last.  And here’s the crazy part: by doing all this, I actually ended up doing less for those I cared about in the long run, because I was physically exhausted, mentally fatigued, and emotionally spent a lot of the time.

So, around October of 2012, I made a decision.  In order for me to be the man I wanted to be for my family, friends, and even colleagues, I had to put myself first.  While it sounds selfish, it’s the complete opposite.  In order to be the best I could be for others, I realized I had to get myself together first.  For those of you who followed me on Facebook then, you already know what it took…a combination of MyFitnessPal calorie tracking and a little-known workout program called Insanity:


Me and my boy, Shaun T, worked out religiously…every day…sometimes mornings…sometimes afternoons…sometimes evenings.  I carried him with me on all my work travel on my laptop and phone…doing Insanity videos in hotel rooms around the world.  I did the 60-day program about 4 times through (with breaks in between cycles)…adding in some weight workouts towards the end.  The results were great, as you can see in the first graphic starting around October 2012.  By staying focused and consistent, I dropped from about 255lbs to 226lbs at my lowest in July 2013.  I got rid of a lot of XXL shirts and 42in waist pants/shorts, and got to a point where I didn’t always feel the need to swim with a shirt on…if ya know what I mean ;-).  So August rolled around, and while I was feeling good about myself…I didn’t feel great, because I knew that while I was lighter and healthier, I wasn’t necessarily that much stronger.  I knew that if I wanted to really be healthy and keep this weight off, I’d need more muscle mass…plus I’d look better too :-P.

So the Crossfit journey began.

Now I’ll be honest, it wasn’t my first thought.  I had read all the horror stories about injuries and seen some of the cult-like stuff about it.  However, a good friend of mine from college was a coach, and pretty much called me out on it…she was right…I was judging something based on others’ opinions and not my own (which is WAY outta character for me).  So…I went to my first Crossfit event…the Women’s Throwdown in Austin, TX (where I live), held by Woodward Crossfit in July of 2013.  It was pretty awesome…it wasn’t full of muscle heads yelling at each other or insane paleo-eating nut jobs trying to outshine one another…it was just hardworking athletes pushing themselves as hard as they could…for a great cause (it’s a charity event)…and having a lot of fun.  I planned to only stay for a little bit, but ended up staying the whole damn day! Long story short…I joined Woodward Crossfit a few weeks after (the delay was because I was determined to complete my last Insanity round, plus I had to go on a business trip), which was around the week of my birthday (Aug 22).



Fast forward a little over a year, with a recently added 21-day Fitness Challenge by David King (who also goes to the same gym), and as of today I’m down about 43lbs (212), with a huge reduction in body fat percentage.  I don’t have the starting or current percentage, but let’s just say all 43lbs lost was fat, and I’ve gained a good amount of muscle in the last year as well…which is why the line flattened a bit before I kicked it up another notch with the 21-Day last month.

Now I’m not posting any more pictures, because that’s not the point of this post (but trust me…I look goooood :P).  My purpose is exactly what the subject says: priorities & perseverance.  What are you prioritizing in your life?  Are you putting too many people’s needs ahead of your own?  Are you happy as a result?  If you were like me, I already know the answer…but you don’t have to stay that way.  You only get one chance at this life, so make the most of it.  Make the choice to put your happiness first, and I don’t mean selfishly…that’s called pleasure.  You’re happier when your loved ones are doing well and happy…you’re happier when you have friends who like you and whom you can depend on…you’re happier when you kick ass at work…you’re happier when you kill it on the basketball court (or whatever activity you like).  Make the decision to be happy, set your goals, then persevere until you attain them…you will stumble along the way…and there will be those around you who either purposely or unknowingly discourage you, but stay focused…it’s not their life…it’s yours.  And when it gets really hard…just remember the wise words of Stuart Smalley:


on September 18, 2014 05:32 AM

The new Ubuntu Touch operating system from Canonical will power the new Meizu MX4 phone and it will be out in December, according to the latest information posted by the Chinese company. We now take a closer look at this new phone to see how it will hold up with an Ubuntu experience.

Canonical hasn’t provided any kind of information about a timetable for the launch of the new Ubuntu phone from Meizu, and even the information that we have right now was initially posted on an Italian blog of the Chinese company. Basically, no one is saying anything officially, but that’s not really the point.

The new Meizu MX4 was announced just a couple of weeks ago and many Ubuntu users have asked themselves if this is the phone that will eventually feature the upcoming Ubuntu Touch. It looks like that is the case, so we now take a closer look at this powerful handset.

Source:

http://news.softpedia.com/news/Everything-You-Need-to-Know-About-Meizu-MX4-the-Upcoming-Ubuntu-Phone-458882.shtml

Submitted by: Silviu Stahie

on September 18, 2014 05:04 AM

Ubuntu shell overpowered

Ayrton Araujo

To be more productive in my environment, as a command-line-centric guy, I started using zsh as my default shell three years ago. And for those who have never tried it, I would like to share my personal thoughts.

What are the main advantages?

  • Extended globbing: For example, *(.) matches only regular files, not directories, whereas a*z(/) matches directories whose names start with a and end with z. There are a bunch of other things (see the short sketch below);
  • Inline glob expansion: For example, type rm *.pdf and then hit tab. The glob *.pdf will expand inline into the list of .pdf files, which means you can change the result of the expansion, perhaps by removing from the command the name of one particular file you don’t want to rm;
  • Interactive path expansion: Type cd /u/l/b and hit tab. If there is only one existing path each of whose components starts with the specified letters (that is, if only one path matches /u/l/b*), then it expands in place. If there are two, say /usr/local/bin and /usr/libexec/bootlog.d, then it expands to /usr/l/b and places the cursor after the l. Type o, hit tab again, and you get /usr/local/bin;
  • Nice prompt configuration options: For example, my prompt is currently displayed as tov@zyzzx:/..cts/research/alms/talk. I prefer to see a suffix of my current working directory rather than have a really long prompt, so I have zsh abbreviate that portion of my prompt at a maximum length.

Source: http://www.quora.com/What-are-the-advantages-and-disadvantages-of-using-zsh-instead-of-bash-or-other-shells
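
To make the first points above concrete, here is a short interactive sketch (the file and directory names are hypothetical):

% ls *(.)          # glob qualifier: list only regular files
% ls a*z(/)        # list only directories whose names start with a and end with z
% rm *.pdf<Tab>    # the glob expands inline into the list of matching .pdf files
% cd /u/l/b<Tab>   # completes to /usr/local/bin when that is the only match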

The Z shell is mainly praised for its interactive use: the prompts are more versatile, the completion is more customizable and often faster than bash-completion, and it is easy to write plugins. One of my favorite integrations is with git, to get better visibility of the current repository status.

As zsh focuses on interactive use, it is a good idea to keep writing your shell scripts with a #!/bin/bash shebang, for interoperability reasons. Bash is still the most mature and stable option for shell scripting, in my point of view.

So, how to install and set up?

sudo apt-get install zsh zsh-lovers -y

zsh-lovers provides a bunch of examples to help you discover better ways to use your shell.

To set zsh as the default shell for your user:

chsh -s /bin/zsh

Don't set zsh as the default shell for the whole system, or some things may stop working.

Two friends of mine, Yuri Albuquerque and Demetrius Albuquerque (brothers of a former hacker family =x), also recommended using https://github.com/robbyrussell/oh-my-zsh. Thanks for the tip.

How to install oh-my-zsh as a normal user?

curl -L http://install.ohmyz.sh | sh

My $ZSH_THEME is set to "bureau" in my $HOME/.zshrc. You can try "random" or other themes located inside $HOME/.oh-my-zsh/themes.
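
For reference, here is a minimal ~/.zshrc sketch along those lines (the theme and plugin choices are illustrative; the git plugin provides the repository-status prompt integration mentioned above):

export ZSH="$HOME/.oh-my-zsh"   # where the installer placed oh-my-zsh
ZSH_THEME="bureau"              # or "random" to get a different theme in each shell
plugins=(git)                   # git-aware prompt and completion helpers
source $ZSH/oh-my-zsh.sh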

For command-not-found integration:

echo "source /etc/zsh_command_not_found" >> ~/.zshrc

If you don't have the command-not-found package:

sudo apt-get install command-not-found -y

And if you use Ruby under RVM, I also recommend reading this:
http://rvm.io/integration/zsh

Happy hacking :-)

on September 18, 2014 12:28 AM

September 17, 2014

Responsive Dummies

Stuart Langridge

After Remy “Remington” Sharp and Bruce “Bruce” Lawson published Introducing HTML5 in 2010, the web development community have been eager to see what they’ll turn their talents to next.1 Now their new book is out, Responsive Design for Dummies.

It’s… got its good points and its bad points. As the cover proudly proclaims, they fully embrace the New World Order of delivering essential features via Web Components. I particularly liked their demonstration of how to wrap a whole site inside a component, thus making your served HTML just be <bruces-site> and so saving you bandwidth2. Their recommendation that Flickr and Facebook use this approach to stop users stealing images may be the best suggestion for future-proofing the web that we’ve heard in 2014 so far. The sidebar on how to use this approach and hash-bang JavaScript URLs together ought to become the new way that we build everything, and I’m eager to see libraries designed for slow connections and accessibility such as Angular.js adopt the technique.

Similarly, the discussion of how Service Workers can deliver business advantages on the Apple iWatch was welcome, particularly given the newness of the release. It’s rare to see a book this up-to-date and this ready to engage with driving the web forward. Did Bruce and Remy get early access to iWatch prototypes or something? I am eager to start leveraging these techniques with my new startup3.

It’s not all perfect, though. I think that devoting three whole chapters to a Dawkins-esque hymn of hatred for everyone who opposed the <picture> element was a bit more tactless than I was hoping for. You won, chaps, there’s no need to rub it in.4

I’d also like to see, if I’m honest, ideas for when breakpoints are less appropriate. I appreciate that the book comes with a free $500 voucher for Getty Images, but after I had downloaded, at Bruce and Remy’s recommendation, separate images for breakpoints at 17px, 48px, 160px, 320px, 341px, 600px, 601px, 603px, 631px, 800px, 850px, 900px, 1280px, 2560px, and 4200px for retina Firefox OS devices, I only had $2.17 left to spend and my server had run out of disc space. Even after using their Haskell utility to convert the images to BMP and JPEG2000 formats I still only score 13.6% on the Google Pagespeed test, and my router has melted. Do better next time, chaps.

Nonetheless, despite these minor flaws, and obvious copy-editing flubs such as “responsive” being misspelled on the cover itself5, I’d recommend this book. Disclaimer: I know both the authors (biblically) personally and while Bruce has indeed promised me “a night to remember” for a positive review, that has not affected at all my judgement of this book as the most important and seminal work in the Web field since Kierkegaard’s “Sarissa.js Tips and Tricks”.

Go and buy it. It’s so popular that it might actually be hard to find a copy, but if your bookseller doesn’t have it, you should shout at them.

  1. other than inappropriate swimwear, obviously
  2. I also liked their use of VML and HTML+TIME in a component
  3. it’s basically Uber for pie fillings
  4. although if you don’t rub it in it’ll stain the mankini
  5. clearly it was meant to say “ahahaha responsive design, what evaaaaar”, but maybe that didn’t fit
on September 17, 2014 01:11 PM

Windows applications sometimes fail to load. But why? Windows won’t tell you; it will instead show a generic and pointless “Application Error” message. Inside this message you will read something like this:

The application was unable to start correctly (0xc0000142). Click OK to close the application.

The only thing you can do here is close the application and search the Internet for that cryptic error code. And maybe that’s the reason why you are reading this post.
It’s not that easy to find a solution to this problem, but I found it thanks to Up and Ready and want to share it with you.

The problem

Windows tells you that the application was unable to start. You can try a hundred times, but the error does not solve itself magically, because it is not random. The problem is that a dll the application loads at startup is unsigned or its digital signature is no longer valid. And it’s not your fault: maybe you just downloaded the program from the official site.

The solution

To solve the Application Error you need an advanced Windows Sysinternals Tool called Autoruns for Windows. You can download it from the official website.

Windows Application Error Autoruns AppInit


Extract the archive you downloaded, launch autoruns.exe and go to the AppInit tab, which lists all the dlls on your computer that are unsigned or whose digital signatures are no longer valid. Right-click each of them, one at a time, go to Properties and rename it. After renaming each one, try launching the application again to find the problematic dll.

If the previous method didn’t solve the application error, right click on the following entry:

HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Windows\AppInit_Dlls

and click on Jump to entry…

Windows Application Error System Registry Editor

A new window opens: it’s the System Registry Editor. Double-click LoadAppInit_DLLs and change the value from 1 to 0. Click OK to confirm and exit. Now launch the affected program and it will start.

Note: some applications may change that value back to 1 after they get launched!
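
If you prefer the command line, the same change can be made from an elevated Command Prompt with reg.exe (a sketch of the equivalent step, not part of the original instructions):

reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Windows" /v LoadAppInit_DLLs /t REG_DWORD /d 0 /f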

The post Windows: How to Solve Application Error 0xc0000142 and 0xc0000005 appeared first on deshack.

on September 17, 2014 01:07 PM

Linux creator Linus Torvalds is well-known for his strong opinions on many technical things. But when it comes to systemd, the init system that has caused a fair degree of angst in the Linux world, Torvalds is neutral.

“When it comes to systemd, you may expect me to have lots of colourful opinions, and I just don’t,” Torvalds told iTWire in an interview. “I don’t personally mind systemd, and in fact my main desktop and laptop both run it.”

Source:

http://www.itwire.com/business-it-news/open-source/65402-torvalds-says-he-has-no-strong-opinions-on-systemd

Submitted by: Sam Varghese

on September 17, 2014 05:03 AM

September 16, 2014

Meeting Minutes

IRC Log of the meeting.

Meeting minutes.

Agenda

20140916 Meeting Agenda


Release Metrics and Incoming Bugs

Release metrics and incoming bug data can be reviewed at the following link:

  • http://people.canonical.com/~kernel/reports/kt-meeting.txt


Status: Utopic Development Kernel

The Utopic kernel remains based on a v3.16.2 upstream stable kernel and
is uploaded to the archive, i.e. linux-3.16.0-15.21. Please test and let
us know your results.
I’d also like to point out that our Utopic kernel freeze date is about 3
weeks away on Thurs Oct 9. Please don’t wait until the last minute to
submit patches needing to ship in the Utopic 14.10 release.
—–
Important upcoming dates:
Mon Sep 22 – Utopic Final Beta Freeze (~1 week away)
Thurs Sep 25 – Utopic Final Beta (~1 week away)
Thurs Oct 9 – Utopic Kernel Freeze (~3 weeks away)
Thurs Oct 16 – Utopic Final Freeze (~4 weeks away)
Thurs Oct 23 – Utopic 14.10 Release (~5 weeks away)


Status: CVEs

The current CVE status can be reviewed at the following link:

http://people.canonical.com/~kernel/cve/pkg/ALL-linux.html


Status: Stable, Security, and Bugfix Kernel Updates – Trusty/Precise/Lucid

Status for the main kernels, until today (Sept. 16):

  • Lucid – verification & testing
  • Precise – verification & testing
  • Trusty – verification & testing

    Current opened tracking bugs details:

  • http://kernel.ubuntu.com/sru/kernel-sru-workflow.html

    For SRUs, SRU report is a good source of information:

  • http://kernel.ubuntu.com/sru/sru-report.html

    Schedule:

    cycle: 29-Aug through 20-Sep
    ====================================================================
    29-Aug Last day for kernel commits for this cycle
    31-Aug – 06-Sep Kernel prep week.
    07-Sep – 13-Sep Bug verification & Regression testing.
    14-Sep – 20-Sep Regression testing & Release to -updates.


Open Discussion or Questions? Raise your hand to be recognized

No open discussion.

on September 16, 2014 05:15 PM

Ubuntu at Fossetcon 2014

Elizabeth K. Joseph

Last week I flew out to the east coast to attend the very first Fossetcon. The conference was on the smaller side, but I had a wonderful time meeting up with some old friends, meeting some new Ubuntu enthusiasts and finally meeting some folks I’d only communicated with online. The room layout took some getting used to, but the conference staff were quick to put up signs and direct attendees in the right direction, in general leading to a pretty smooth conference experience.

On Thursday the conference hosted a “day zero” that had training and an Ubucon. I attended the Ubucon all day, which kicked off with Michael Hall doing an introduction to the Ubuntu on Phones ecosystem, including Mir, Unity8 and the telephony features that needed to be added to support phones (voice calling, SMS/MMS, cell data, SIM card management). He also talked about the improved developer portal with more resources aimed at app developers, including the Ubuntu SDK and simplified packaging with click packages.

He also addressed the concern of many about whether Ubuntu could break into the smartphone market at this point, arguing that it’s a rapidly developing and changing market, with every current market leader only having been there for a handful of years, and that new ideas need to play to win. Canonical feels that convergence between phone and desktop/laptop gives Ubuntu a unique selling point, and that users will like it because the intuitive design, with lots of swiping and scrolling actions, gives apps the most screen space possible. It was interesting to hear that partners/OEMs can offer operator differentiation as a layer without fragmenting the actual operating system (something that Android struggles with), leaving the core operating system independently maintained.

This was followed up by a more hands on session on Creating your first Ubuntu SDK Application. Attendees downloaded the Ubuntu SDK and Michael walked through the creation of a demo app, using the App Dev School Workshop: Write your first app document.

After lunch, Nicholas Skaggs and I gave a presentation on 10 ways to get involved with Ubuntu today. I had given a “5 ways” talk earlier this year at the SCaLE in Los Angeles, so it was fun to do a longer one with a co-speaker and have his five items added in, along with some other general tips for getting involved with the community. I really love giving this talk, the feedback from attendees throughout the rest of the conference was overwhelmingly positive, and I hope to get some follow-up emails from some new contributors looking to get started. Slides from our presentation are available as pdf here: contributingtoubuntu-fossetcon-2014.pdf


Ubuntu panel, thanks to Chris Crisafulli for the photo

The day wrapped up with an Ubuntu Q&A Panel, which had Michael Hall and Nicholas Skaggs from the Community team at Canonical, Aaron Honeycutt of Kubuntu and myself. Our quartet fielded questions from moderator Alexis Santos of Binpress and the audience, on everything from the Ubuntu phone to challenges of working with such a large community. I ended up drawing from my experience with the Xubuntu community a lot in the panel, especially as we drilled down into discussing how much success we’ve had coordinating the work of the flavors with the rest of Ubuntu.

The next couple of days brought Fossetcon proper, which I’ll write about later. The Ubuntu fun continued though! I was able to give away 4 copies of The Official Ubuntu Book, 8th Edition, which I signed, and got José Antonio Rey to sign as well since he had joined us for the conference from Peru.

José ended up doing a talk on Automating your service with Juju during the conference, and Michael Hall had the opportunity to give a talk on Convergence and the Future of App Development on Ubuntu. The Ubuntu booth also looked great and was one of the most popular of the conference.

I really had a blast talking to Ubuntu community members from Florida, they’re a great and passionate crowd.

on September 16, 2014 05:01 PM

New SubLoCo Policy

Ubuntu LoCo Council

Hi, after a lot of work, thinking and talking about the problem of the LoCo Organization and the SubLoCos, we came up with the following policy:

  • Each team will be a country (or state in the United States). We will call this a ‘LoCo’.
  • Each LoCo can have sub-teams. These sub-teams will be created according to the will and needs of each LoCo.
  • A LoCo may have sub-teams or not have sub-teams.
  • In the event a LoCo does have sub-teams, a Team Council needs to be created.
  • A Team Council is composed of at least one member from each sub-team.
  • The members that will be part of the Team Council will be chosen by other current members of the team.
  • The Team Council will have the power to make decisions regarding the LoCo.
  • The Team Council will also have the power to request partner items, such as conference and DVD packs.
  • The LoCo Council will only recognize one team per country (or state in the United States). This is the team that will be in the ~locoteams team in Launchpad.
  • In the event a LoCo wants to go through the verification process, the LoCo will go through it, and not individual sub-teams.
  • LoCos not meeting the criteria of country/state teams will be denied verification.
  • In the event what is considered a sub-team wants to be considered a LoCo, it will need to present a request to the LoCo Council.
  • The LoCo Council will provide a response, which is, in no way, related to verification. The LoCo will still have to apply for verification if wanted.

We encourage the LoCo teams to see if this new form of organization is a fit for you; if so, please start forming sub-teams as you find useful. If a team needs help with this or anything else, contact us; we are here to help!

on September 16, 2014 03:24 PM

September 15, 2014

Welcome to the Ubuntu Weekly Newsletter. This is issue #383 for the week September 8 – 14, 2014, and the full version is available here.

In this issue we cover:

This issue of the Ubuntu Weekly Newsletter is brought to you by:

  • Elizabeth K. Joseph
  • Jose Antonio Rey
  • And many others

If you have a story idea for the Weekly Newsletter, join the Ubuntu News Team mailing list and submit it. Ideas can also be added to the wiki!

Except where otherwise noted, content in this issue is licensed under a Creative Commons Attribution ShareAlike 3.0 License.

on September 15, 2014 11:51 PM

3 years and counting…

José Antonio Rey

On a 15th September, 3 years ago, I got my Ubuntu Membership.

There’s only one thing I can say about it: it’s been the most wonderful and awesome 3 years I could have had. I would’ve never thought that I would find such a welcoming and amazing community.

Even though I may not have worked with you directly, thank you. You all are what makes the community awesome – I couldn’t imagine it without any one of you. We are all building the future, so let’s continue!

As I said in the title, I hope that it’s not only 3 years. I’ll keep on counting!


on September 15, 2014 04:22 PM

Back in April, I upstreamed a bug (that is, reported it to Debian) regarding the `nginx-naxsi` packages. The initial bug I upstreamed was about the outdated naxsi version in the naxsi packages (see this bug in Ubuntu and the related bug in Debian).

The last update on the Debian bug is on September 10, 2014. That update says the following, and was made by Christos Trochalakis:

After discussing it with the fellow maintainers we have decided that it is
better to remove the nginx-naxsi package before jessie is freezed.

Packaging naxsi is not trivial and, unfortunately, none of the maintainers uses
it. That’s the reason nginx-naxsi is not in a good shape and we are not feeling
comfortable to release and support it.

We are sorry for any inconvenience caused.

I asked what the expected timeline was for the packages being dropped. In a response from Christos today, September 15, 2014, it was said:

It ‘ll get merged and released (1.6.1-3) by the end of the month.


In Ubuntu, these changes will likely not make it into 14.10, but future versions of Ubuntu beyond 14.10 (such as 15.04) will likely have this change.

In the PPAs, the naxsi packages will be dropped with stable 1.6.1-3+precise0 +trusty0 +utopic0 and mainline 1.7.4-1+precise0 +trusty0 +utopic0 or will be dropped in later versions if a new point release is made before then.

In Debian, these changes are likely to hit by the end of the month (with 1.6.1-3).

on September 15, 2014 02:50 PM

Last week I attended FOSSETCON, a new open source convention here in central Florida, and I had the opportunity to give a couple of presentations on Ubuntu phones and app development. Anybody who knows me knows that I love talking about these things, but a lot fewer people know that doing it in front of a room of people I don’t know still makes me extremely nervous. I’m an introvert, and even though I have a public-facing job and work with the wider community all the time, I’m still an introvert.

I know there are a lot of other introverts out there who might find the idea of giving presentations to be overwhelming, but they don’t have to be.  Here I’m going to give my personal experiences and advice, in the hope that it’ll encourage some of you to step out of your comfort zones and share your knowledge and talent with the rest of us at meetups and conferences.

You will be bad at it…

Public speaking is like learning how to ride a bicycle, everybody falls their first time. Everybody falls a second time, and a third. You will fidget and stutter, you will lose your train of thought, your voice will sound funny. It’s not just you, everybody starts off being bad at it. Don’t let that stop you though, accept that you’ll have bruises and scrapes and keep getting back on that bike. Coincidentally, accepting that you’re going to be bad at the first ones makes it much less frightening going into them.

… until you are good at it

I read a lot of things about how to be a good and confident public speaker, the advice was all over the map, and a lot of it felt like pure BS.  I think a lot of people try different things and when they finally feel confident in speaking, they attribute whatever their latest thing was with giving them that confidence. In reality, you just get more confident the more you do it.  You’ll be better the second time than the first, and better the third time than the second. So keep at it, you’ll keep getting better. No matter how good or bad you are now, you will keep getting better if you just keep doing it.

Don’t worry about your hands

You’ll find a lot of suggestions about how to use your hands (or not use them), how to walk around (or not walk around) or other suggestions about what to do with yourself while you’re giving your presentation. Ignore them all. It’s not that these things don’t affect your presentation, I’ll admit that they do, it’s that they don’t affect anything after your presentation. Think back about all of the presentations you’ve seen in your life, how much do you remember about how the presenter walked or waved their hands? Unless those movements were integral to the subject, you probably don’t remember much. The same will happen for you, nobody is going to remember whether you walked around or not, they’re going to remember the information you gave them.

It’s not about you

This is the one piece of advice I read that actually has helped me. The reason nobody remembers what you did with your hands is because they’re not there to watch you, they’re there for the information you’re giving them. Unless you’re an actual celebrity, people are there to get information for their own benefit, you’re just the medium which provides it to them.  So don’t make it about you (again, unless you’re an actual celebrity), focus on the topic and information you’re giving out and what it can do for the audience. If you do that, they’ll be thinking about what they’re going to do with it, not what you’re doing with your hands or how many times you’ve said “um”. Good information is a good distraction from the things you don’t want them paying attention to.

It’s all just practice

Practicing your presentation isn’t nearly as stressful as giving it, because you’re not worried about messing up. If you mess up during practice you just correct it, make a note to not make the same mistake next time, and carry on. Well if you plan on doing more public speaking there will always be a next time, which means this time is your practice for that one. Keep your eye on the presentation after this one, if you mess up now you can correct it for the next one.

 

All of the above are really just different ways of saying the same thing: just keep doing it and worry about the content not you. You will get better, your content will get better, and other people will benefit from it, for which they will be appreciative and will gladly overlook any faults in the presentation. I guarantee that you will not be more nervous about it than I was when I started.

on September 15, 2014 09:00 AM

Last week’s autopkgtest 3.5 release (in Debian sid and Ubuntu Utopic) brings several new features which I’d like to announce.

Tests that reboot

For testing low-level packages like init or the kernel it is sometimes desirable to reboot the testbed in the middle of a test. For example, I added a new boot_and_services systemd autopkgtest which configures grub to boot with systemd as pid 1, reboots, and then checks that the most important services like lightdm, D-BUS, NetworkManager, and cron come up as expected. (This test will be expanded a lot in the future to cover other areas like the journal, logind, etc.)

In a testbed which supports rebooting (currently only QEMU), your test will now find an “autopkgtest-reboot” command which the test calls with an arbitrary “marker” string. autopkgtest will then reboot the testbed, save/restore any files it needs to (like the test’s file tree or previously created artifacts), and then re-run the test with ADT_REBOOT_MARK=mymarker.

The new “Reboot during a test” section in README.package-tests explains this in detail with an example.
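
As a rough illustration of the flow described above, a DEP-8 test script using this feature could be structured like this (the marker name and the commands in each phase are purely illustrative):

#!/bin/sh
# On the first run ADT_REBOOT_MARK is unset; after the reboot the same
# script is re-run with ADT_REBOOT_MARK set to the marker passed below.
set -e
case "$ADT_REBOOT_MARK" in
    "")
        echo "phase 1: prepare whatever needs to survive the reboot"
        autopkgtest-reboot after-setup    # testbed reboots, test restarts
        ;;
    after-setup)
        echo "phase 2: check that services came back up as expected"
        ;;
esac

Since rebooting is currently only supported in QEMU, such a test would be exercised with something like adt-run mypackage --- qemu adt-utopic-amd64-cloud.img (package and image names assumed).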

Implicit test metadata for similar packages

The Debian pkg-perl team recently discussed how to add package tests to the ~3,000 Perl packages. For most of these the test metadata looks pretty much the same, so they created a new pkg-perl-autopkgtest package which centralizes the logic. autopkgtest 3.5 now supports an implicit debian/tests/control file to avoid having to modify several thousand packages with exactly the same file.

An initial run already looked quite promising: 65% of the packages pass their tests. There will be a few iterations to identify common failures and fix those in pkg-perl-autopkgtest and autopkgtest itself now.

There is still some discussion about how implicit test control files go together with the DEP-8 specification, as other runners like sadt do not support them yet. Most probably we’ll declare those packages XS-Testsuite: autopkgtest-pkg-perl instead of the usual autopkgtest.

In the same vein, Debian’s Ruby maintainer (Antonio Terceiro) added implicit test control support for Ruby packages. We haven’t done a mass test run with those yet, but their structure will probably look very similar.

on September 15, 2014 08:23 AM

Hi all,
after a long time I’m back to writing, to show you how to create a simple game for Ubuntu for Phones (but also for Android) with Bacon2D.

Bacon2D is a framework to ease 2D game development, providing ready-to-use QML elements representing basic game entities needed by most games.

As a tutorial, I’ll explain how I created my first QML game, 100balls, which you can find in the Ubuntu Store for phones. The source is available on GitHub.

Installation

So, first of all, we need to install Bacon2D on our system. I assume you have already installed Qt, so we only need to grab the source and compile it:

git clone git@github.com:Bacon2D/Bacon2D.git
cd Bacon2D
mkdir build && cd build
qmake ..
make
sudo make install

Now you have Bacon2D on your system, and you can import it in every project you want.

A first look at Bacon2D

Bacon2D provides a good number of custom components for your app. Of course, I can’t describe them all in one article, so please read the documentation. We’ll use only a few of them, and I think the best way to introduce you to them is to write the app.
So, let’s start!

First of all, we create our base file, called 100balls.qml:

import QtQuick 2.0
import Bacon2D 1.0

The first element we add is the Game element. Game is the top-level container, where the whole game will live. We set some basic properties and the name of the game with the gameName property:

import QtQuick 2.0
import Bacon2D 1.0
 
Game {
    id: game
    anchors.centerIn: parent
 
    height: 680
    width: 440
 
    gameName: "com.ubuntu.developer.rpadovani.100balls" // Ubuntu Touch name format, you can use whatever you want
}

But the Game itself is useless; we need to add one or more Scene elements to it. A Scene is the place where all the Entity elements of the game will be placed.
Scene has a lot of properties; for now it is important to set two of them: running indicates whether everything in the scene moves and the game engine runs, and physics indicates whether Box2D should be used to simulate physics in the game. We want a game where some balls fall, so we need to set it to true.

import QtQuick 2.0
import Bacon2D 1.0
 
Game {
    id: game
    anchors.centerIn: parent
 
    height: 680
    width: 440
 
    gameName: "com.ubuntu.developer.rpadovani.100balls" // Ubuntu Touch name format, you can use whatever you want
 
    Scene {
        id: gameScene
        physics: true
        running: true
    }
}
on September 15, 2014 07:00 AM

September 14, 2014

Brum Tech Scene interviews

Stuart Langridge

Today I released the first of the Brum Tech Scene interviews, with me talking to Simon Jenner of Silicon Canal and Oxygen Startups. There’s a video on the site from me explaining why I’m doing this, but I figure that the more discerning audience for as days pass by might appreciate a more in-depth discussion.

I love this city. I love that we’re prepared to spend a hundred and ninety million quid on building the best library in the whole world. I love that there’s so much going on, tech-wise. But nobody talks to anybody else. If you look at, say, Brighton, the whole tech scene there all hang out together. They can put on a Digital Brighton week and have dConstruct be part of it and Seb do mad things with visualisations and that’s marvellous. We ought to have that. I want us to have that.

We don’t have a tech scene. We’ve got twenty separate tech scenes. What I want to do is knock down the walls a bit. So the designers talk to the SEO people and the Linux geeks talk to the designers. Because there is no way that this can be a bad thing.

I also want to learn a bit about videos. Now, let’s be clear here. I know from a decade of podcasting that with a mild expenditure of money on gear, and a great sound engineer (Jono Bacon, step forward) you can produce something as good as the professionals. Bad Voltage sounds as good, production-wise, as the BBC’s Today programme does. Video is not like that. There is a substantial difference between amateur and professional efforts; one bloke using mobile phones to record cannot make something that looks like Sherlock or Game of Thrones. I’m not trying to look professional here; I’m aiming for “competent amateur”. I’ve learned loads about how to record a video interview, how to mix it, how to do the editing. Sit far enough apart that your voice doesn’t sound on their mic. Apply video effects to the clip before you cut it up. Don’t speak over the interviewee. KDEnLive’s “set audio reference” is witchcraft brilliance. I knew none of this two months ago. And I’ve really enjoyed learning. I am in no wise good at this stuff, but I’m better than I was.

This has been a fun project to set up, and it will continue being fun as I record more interviews. My plan is to have a new one every Monday morning, indefinitely, as long as people like them and I’m still interested in doing them. I should give big love to Mike, my designer, who I fought with tooth and nail about the site design and the desaturated blue look to the videos, and to Dan Newns who sat and was interviewed as a test when I first came up with this idea, and has provided invaluable feedback throughout.

If you know something about video editing, I’d love to hear how I can do better. Ping me on twitter or by mail. Tell me as well who you want to hear interviewed; which cool projects are going on that I don’t know about. I’d also love to hear about cool venues in the city in which I can do interviews; one of my subsidiary goals here is to show off the city’s tech places. Annoyingly, I spoke to the Library and to the Birmingham Museums Trust and they were all “fill out our fifteen page form” because they’re oriented around the BBC coming in with a crew of twenty camera people, not one ginger guy with a mobile phone and a dream. Maybe I’ll do things with @HubBirmingham once they actually exist.

I should talk about the tech, here. I record the interviews on an iPhone 5, a Nexus 4, and a little HD camera I bought years ago. The audio is done with two Røde Smartlav lapel mics plugged into the two phones. None of this is expensive, which has a cost in terms of video and audio quality but critically doesn’t have much of a cost in terms of actual pounds sterling. And editing is done with KDEnLive (kdenlive?) which is a really powerful non-linear video editor for Ubuntu, and the team who make it should be quite proud. The big thing I’m missing (apart from a cameraman) is a tripod, which I can probably buy for about ten quid, and I will do once I find one that’s tall and yet still fits in my laptop bag.

Anyway, that’s the story of the Brum Tech Scene interviews. There’ll be one every Monday. I hope you like them. I hope they help, even in a small way, to make the Brum tech scene gel together even more than it has thus far. Let me know what you think. brumtechscene.co.uk.

on September 14, 2014 11:56 PM

I’m quitting relinux

Joel Leclerc

I will start this off by saying: I’m very (and honestly) sorry for, well, everything.

To give a bit of history, I started relinux as a side-project for my CosmOS project (cloud-based distribution … which failed), in order to build the ISOs. The only reasonable alternative at the time was remastersys, and I realized I would have to patch it anyway, so I thought that I might as well make a reusable tool for other distributions to use too.

Then came a rather large amount of friction between me and the author of remastersys, of which I will not go into any detail of. I acted very immaturely then, and wronged him several times. I had defamed him, made quite a few people very angry at him, and even managed to get some of his supporters against him. True, age and maturity had something to do with it (I was 12 at the time), but that still doesn’t excuse my actions at all.

So my first apology is to Tony Brijeski, the author of remastersys, for all the trouble and possible pain I had put him through. I’m truly sorry for all of this.

However, though the dynamics with Tony and remastersys are definitely a large part of why I’m quitting relinux, that is not all. The main reason, actually, is lack of interest. I have rewritten relinux a total of 7 times (including the original fork of remastersys), and I really hate the debugging process (takes 15-20 minutes to create an ISO, so that I can debug it). I have also lost interest in creating linux distributions, so not only am I very tired of working on it, I also don’t really care about what it does.

On this note, my second apologies (and thanks) have to go to those who have helped me so much through the process, especially those who have tried to encourage me to finish relinux. Those listed are in no particular order, and if I forgot you, then let me know (and I apologize for that!):

  • Ko Ko Ye
  • Raja Genupula
  • Navdeep Sidhu
  • Members of the TSS Web Dev Club
  • Ali Hallahi
  • Gert van Spijker
  • Aritra Das
  • Diptarka Das
  • Alejandro Fernandez
  • Kendall Weaver

Thank you very much for everything you’ve done!

Lastly, I would like to explain my plans for it, in case anyone wants to continue it (by no means do I want to enforce these, these are just ideas).

My plan for the next release of relinux was to actually make a very generic and scriptable CLI ISO creation tool, and then make relinux a specific set of “profiles” for that tool (plus an interface). The tool would basically contain a few libraries for the chosen scripting language, for things like storing the filesystem (SquashFS or other), ISO creation, and general utilities for editing files while keeping permissions, multi-threading/processing, etc… The “profiles” would then copy, edit, and delete files as needed, set up the tool wanted for running the live system (in Ubuntu’s case, this’d be casper), set up the installer/bootloader, and such.

I would like to apologize to you all, the people who have used relinux and have waited for a stable version for 3 years, for not doing this. Thank you very much for your support, and I’m very sorry for having constantly pushed releases back and having never made a stable or well working version of relinux. Though I do have some excuses as to why the releases didn’t work, or why I didn’t test them well enough, none of them can cover why I didn’t fix them or work on it more. And for that, I am very sorry.

I know that this is a very large post for something so simple, but I feel that it would not be right if I didn’t apologize to those I have done wrong to, and thanked those who have helped me along the way.

So to summarize, thank you, sorry, and relinux is now dead.

- Joel Leclerc (MiJyn)


on September 14, 2014 11:24 PM

Getting Started in CTFs

David Tomaschik

My last post was about getting started in a career in information security. This post is about the sport end of information security: Capture the Flag (CTFs).

I'd played around with some wargames (Smash the Stack, Over the Wire, and Hack this Site) before, but my first real CTF (timed, competitive, etc.) was the CTF run by Mad Security at BSides SF 2013. By some bizarre twist of fate, I ended up winning the CTF, and I was hooked. I've probably played in about 30 CTFs since, most of them online with the team Shadow Cats. It's been a bumpy ride, but I've learned a lot about a variety of topics by doing this.

If you're in the security industry and you've never tried a CTF, you really should. Personally, I love CTFs because they get me to exercise skills that I never get to use at work. They also inspire some of my research and learning. The only problem is making the time. :)

Here are some resources I've thought were interesting:

on September 14, 2014 08:07 PM

September 13, 2014

"Your release sucks."

Luke Faraone

I look forward to Ubuntu's semiannual release day, because it's the completion of 6ish months of work by Ubuntu (and by extension Debian) developers.

I also loathe it, because every single time we get people saying "This Ubuntu release is the worst release ever!".

Ubuntu releases are always rocky around release time, because the first time Ubuntu gets widespread testing is on or after release day.

We ship software to 12 Million Ubuntu Users with only 150 MOTUs who work directly on the platform. That's a little less than 1 developer with upload rights to the archive for every 60,000 users. ((This number, like all other usage data, is dated, and probably wasn't even accurate when it was first calculated)) Compared to Debian, which (at last estimate in 2010) had 1.5 million uniques on security.debian.org, yet has around 1000 Debian Developers.

Debian has a strong testing culture; someone once estimated that around ¾ of Debian users are running unstable or testing. In Ubuntu, we don't have good metrics that I'm aware of on how many people are using the development release (pointers welcome), but I'd guess that it's a very very small percentage. A common thread in bug reports, if we get a response at all, goes as follows:
Triager: ((Developer, bugcontrol member, etc. Somebody who is not experiencing the problem but wants to help.)) "Is this a problem in $devel?"
User: "I'll let you know when it hits final"
Triager: "It's too late then. Then we'll want you to test in the next release. We have to fix it BEFORE its final"
User: "Ok, I'll test at beta."
Triager: "That's 2 weeks before release, which will be too late. Please test ASAP if you want us to have time to fix it"

Of course, there are really important bugs with hardware support which keep on cropping up. But if they're just getting reported on or around release day, there are limits to what can be done about them this cycle.

We need to make it easier for people to run early development versions, and encourage more people to use them (as long as they're willing to deal with breakage). I'm not sure whether unstable/testing is appropriate for Ubuntu, and I'm fairly confident that we don't want to move to a rolling release (currently being discussed in Debian, summary). But we badly need more developers, and equally importantly, more testers to try it out earlier in the release process.

To users: please, please try out the development versions. Download a LiveCD and run a smoketest, or check if bugs you reported are in fact fixed in the later versions. And do it early and often.
on September 13, 2014 08:43 PM

I've only been an information security practitioner for about a year now, but I've been doing things on my own for years before that. However, many people are just getting into security, and I've recently stumbled on a number of resources for newcomers, so I thought I'd put together a short list.

on September 13, 2014 07:30 PM

September 12, 2014

student chromebook

Are you enrolled in college, in need of a laptop, and willing to accept a new Chromebook? If so, Google has a deal for you, and it's called the Google Lending Library.

The Chromebook Lending Library is traveling to 12 college campuses across the U.S. loaded with the latest Chromebooks. The Lending Library is a bit like your traditional library, but instead of books, we’re letting students borrow Chromebooks (no library card needed). Students can use a Chromebook during the week for life on campus— whether it’s in class, during an all-nighter, or browsing the internet in their dorm.

Lindsay Rumer, Chrome Marketing


Assuming you attend one of the partnered universities, here is how it works.

1. Request a Chromebook from the Library
2. Agree to the Terms of Use Agreement
3. Use the Chromebook as you like while you attend school
4. Return it when you want or when you leave

What happens if you don’t return it? Expect to receive a bill for the fair market value not to exceed $220.

Here’s the fine print.

“Evaluation Period” means the period of time specified to you at the time of checkout of a Device.

“Checkout Location” means the location specified by Google where Devices will be issued to you and collected from you.

1.1 Device Use. You may use the Device issued to you for your personal evaluation purposes. Upon your use of the Device, Google transfers title to the Device equipment to you, but retains all ownership rights, title and interest to any Google Devices and services and anything else that Google makes available to you, including without limitation any software on the Device.

1.2 Evaluation Period. You may use the Device during the Evaluation Period. Upon (i) expiration of the Evaluation Period, or (ii) termination of this Agreement, if this Agreement is terminated early in accordance Section 4, you agree to return the Device to the Checkout Location. If you fail to return the Device at the end of the Evaluation Period or upon termination of this Agreement, you agree Google may, to the extent allowed by applicable law, charge you up to the fair market value of the Device less normal wear and tear and any applicable taxes for an amount not to exceed Two Hundred Twenty ($220.00) Dollars USD.

1.3 Feedback. Google may ask you to provide feedback about the Device and related Google products optimized for Google Services. You are not required to provide feedback, but, if you do, it must only be from you, truthful, and accurate and you grant Google permission to use your name, logo and feedback in presentations and marketing materials regarding the Device. Your participation in providing feedback may be suspended at any time.

1.4 No Compensation. You will not be compensated for your use of the Devices or for your feedback.

2. Intellectual Property Rights. Nothing in this Agreement grants you intellectual property rights in the Devices or any other materials provided by Google. Except as provided in Section 1.1, Google will own all rights to anything you choose to submit under this Agreement. If that isn’t possible, then you agree to do whichever of the following that Google asks you to do: transfer all of your rights regarding your submissions to Google; give Google an exclusive, irrevocable, worldwide, royalty-free license to your submissions to Google; or grant Google any other reasonable rights. You will transfer your submissions to Google, and sign documents and provide support as requested by Google, and you appoint Google to act on your behalf to secure these rights from you. You waive any moral rights you have and agree not to exercise them, unless you notify Google and follow Google’s instructions.

3. Confidentiality. Your feedback and other submissions, is confidential subject to Google’s use of your feedback pursuant to Section 1.3.

4. Term. This Agreement becomes effective when you click the “I Agree” button and remains in force through the end of the Evaluation Period or earlier if either party gives written termination notice, which will be effective immediately. Upon expiration or termination, you will return the Device as set forth below. Additionally, Google will remove you from any related mailing lists within thirty (30) days of expiration or termination. Sections 1.3, 1.4, and Sections 2 through 5 survive any expiration or termination of this Agreement.

5. Device Returns. You will return the Device(s) to Google or its agents to the Checkout Location at the time specified to you at the time of checkout of the Device or if unavailable, to Google Chromebook Lending Library, 1600 Amphitheatre Parkway, Mountain View, CA 94043. Google may notify you during or after the term of this Agreement regarding return details or fees chargeable to you if you fail to return the Device.

The post Get a free Chromebook from the Google Lending Library appeared first on john's journal.

on September 12, 2014 10:52 PM

Dyn's free dynamic DNS service closed on Wednesday, May 7th, 2014.

CloudFlare, however, has a little-known feature that will allow you to update
your DNS records via its API or a command-line script called ddclient. This will
give you the same result, and it's also free.

Unfortunately, ddclient does not work with CloudFlare out of the box. There is
a patch available,
and here is how to hack[1] it up on Debian or Ubuntu; it also works in Raspbian on the Raspberry Pi.

Requirements

basic command line skills, and a domain name
that you own.

CloudFlare

Sign up to CloudFlare and add your domain name.
Follow the instructions, the default values it gives should be fine.

You'll be letting CloudFlare host your domain so you need to adjust the
settings at your registrar.

If you'd like to use a subdomain, add an 'A' record for it. Any IP address
will do for now.

Let's get to business...

Installation

$ sudo apt-get install ddclient

Patch

$ sudo apt-get install curl sendmail libjson-any-perl libio-socket-ssl-perl
$ curl -O http://blog.peter-r.co.uk/uploads/ddclient-3.8.0-cloudflare-22-6-2014.patch 
$ sudo patch /usr/sbin/ddclient < ddclient-3.8.0-cloudflare-22-6-2014.patch

Config

$ sudo vi /etc/ddclient.conf

Add:

##
### CloudFlare (cloudflare.com)
###
ssl=yes
use=web, web=dyndns
protocol=cloudflare, \
server=www.cloudflare.com, \
zone=domain.com, \
login=you@email.com, \
password=api-key \
host.domain.com

Comment out:

#daemon=300

Your api-key comes from the account page

ssl=yes might already be in that file

use=web, web=dyndns will use dyndns to check IP (useful for NAT)

You're done. Log in to https://www.cloudflare.com and check that the IP listed for
your domain matches http://checkip.dyndns.com

To verify your settings:

sudo ddclient -daemon=0 -debug -verbose -noquiet
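
Since daemon mode is commented out above, one option (my suggestion, not part of the original guide) is to let cron run the update check periodically, for example every five minutes:

*/5 * * * * /usr/sbin/ddclient -syslog -quiet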

Fork this:
https://gist.github.com/ayr-ton/f6db56f15ab083ab6b55

on September 12, 2014 06:47 PM

My Family…

Harald Sitter

… is the best in the whole wide world!
akademy2014

on September 12, 2014 03:33 PM

S07E24 – The One with the Holiday Armadillo

Ubuntu Podcast from the UK LoCo

We’re back with Season Seven, Episode Twenty-Four of the Ubuntu Podcast! Alan Pope, Mark Johnson, and Laura Cowen are drinking tea and eating Battenburg cake in Studio L.

In this week’s show:

  • We discuss whether communities suck…

  • We also discuss:

    • Aurasma augmented reality
    • Upgrading to 14.10
    • Converting a family member to Ubuntu
  • We share some Command Line Lurve (from Patrick Archibald on G+) which pops up a notification on an XBMC/Kodi media centre via its JSON-RPC API:
      curl -X POST -H "Content-Type: application/json" -d '{"jsonrpc":"2.0","method":"GUI.ShowNotification","params":{"title":"This is the title of the message","message":"This is the body of the message"},"id":1}' http://wopr.local:8080/jsonrpc
    
  • And we read your feedback. Thanks for sending it in!
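
For anyone who wants to reuse that one-liner, here is a minimal wrapper sketch. It assumes the same JSON-RPC endpoint as the example above (wopr.local:8080 is simply the host from that command; substitute your own), and it takes the title and message as arguments:

#!/bin/sh
# notify.sh - pop up a notification via the JSON-RPC call from the show notes
# usage: ./notify.sh "Title" "Body" [host:port]
# note: quotes inside the title/body are not escaped in this sketch
HOST="${3:-wopr.local:8080}"
curl -s -X POST -H "Content-Type: application/json" \
  -d "{\"jsonrpc\":\"2.0\",\"method\":\"GUI.ShowNotification\",\"params\":{\"title\":\"$1\",\"message\":\"$2\"},\"id\":1}" \
  "http://$HOST/jsonrpc"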

We’ll be back next week, so please send your comments and suggestions to: podcast@ubuntu-uk.org
Join us on IRC in #uupc on Freenode
Leave a voicemail via phone: +44 (0) 203 298 1600, sip: podcast@sip.ubuntu-uk.org and skype: ubuntuukpodcast
Follow us on Twitter
Find our Facebook Fan Page
Follow us on Google+

on September 12, 2014 01:05 PM

September 11, 2014

Off to Berlin

Benjamin Kerensa

Right now, as this post is published, I’m probably settling into my seat for the next ten hours headed to Berlin, Germany as part of a group of leaders at Mozilla who will be meeting for ReMo Camp. This is my first transatlantic trip ever and perhaps my longest flight so far, so I’m both […]
on September 11, 2014 08:45 PM

Akademy Poll

Jonathan Riddell

KDE Project:

on September 11, 2014 08:42 PM

KDE Project:

DSC_0769
Hacking hard in the hacking room

DSC_0773
Blue Systems Beer

DSC_0775
You will keep GStreamer support in Phonon

DSC_0780
Boat trip on the loch

DSC_0781
Off the ferry

DSC_0783
Bushan leads the way

DSC_0784
A fairy castle appears in the distance

DSC_0787
The talent show judges

DSC_0790
Sinny models our stylish Kubuntu polo shirts

DSC_0793
Kubuntu Day discussions with developers from the Munich Kubuntu rollout

IMG 9510 v1
Kubuntu Day group photo with people from Kubuntu, KDE, Debian, Limux and Net-runner

c IMG 8903 v1
Jonathan gets a messiah complex

on September 11, 2014 08:34 PM

On Wearable Technology

Benjamin Kerensa

The Web has been filled with buzz about the new Android watches and the new Apple Watch, but I’m still skeptical as to whether these first iterations of smartwatches will have the kind of sales Apple and Google are hoping for. I do think wearable tech is the future. In fact, I owned […]
on September 11, 2014 12:00 PM

Accessible KDE, Kubuntu

Valorie Zimmerman

KDE is community. We welcome everyone, and make our software work for everyone. So, accessibility is central to all our work, in the community, in testing, coding, documentation. Frederik has been working to make this true in Qt and in KDE for many years, Peter has done valuable work with Simon and Jose is doing testing and some patches to fix stuff.

However, now that KF5 is rolling out, we're finding a few problems with our KDE software, such as widgets, KDE configuration modules (KCMs), and even websites. But the a11y team is too small to handle all this! Obviously, we need to grow the team.

So we've decided to make heavier use of the forums, where we might find new testers and folks to fix the problems, and perhaps even people to fix up the https://accessibility.kde.org/ website to be as
awesome as the KDE-Edu site. The Visual Design Group are the leaders here, and they are awesome!

Please drop by #kde-accessibility on Freenode or the Forum https://forum.kde.org/viewforum.php?f=216 to read up on what needs doing, and learn how to test. People stepping up to learn forum
moderation are also welcome. Frederik has recently posted about the BoF: https://forum.kde.org/viewtopic.php?f=216&t=122808

A11y was a topic in the Kubuntu BoF today, and we're going to make a new push to make sure our accessibility options work well out of the box, i.e. from first boot. This will involve working with the Ubuntu a11y team, yeah!

More information is available at
https://community.kde.org/Accessibility and
https://userbase.kde.org/Applications/Accessibility
on September 11, 2014 10:31 AM

Canonical and Ubuntu at dConstruct

Canonical Design Team

Brighton is not just a lovely seaside town, mostly known for being overcrowded in summer by Londoners in search of a bit of escapism, but also the home of a thriving community of designers, makers and entrepreneurs. Some of these people run dConstruct, a gathering where creative minds of all sorts converge every year to discuss important themes around digital innovation and culture.

When I found out that we were sponsoring the conference this year, I promptly jumped in to help my colleagues in the Phone, Web and Juju design teams. Our stand was situated in the foyer of the Brighton Dome, flashing the orange banner of Ubuntu and a number of origami unicorns.

The Ubuntu Stand

Origami Unicorns

We had an incredibly positive response from the attendees, as our stand was literally teeming with Ubuntu enthusiasts who were really keen to check our progress with the phone. We had a few BQ phones on display where we showed the new features and designs.

Testing the phone

For us, it was a great occasion to gather fresh impressions of the user experience on the phone and across a variety of apps. After a few moments, people started to understand the edge interactions and began to swipe left and right, giving positive feedback on the responsiveness of the UI. Our pre-release BQ phones don’t have the final shell and still display softkeys, and as a result some people found this confusing. We took the opportunity to quickly design our own custom BQ phone by using a bunch of Ubuntu stickers…and voilà, problem solved! ;)

Ubuntu phone - customised

Our ‘Make your Unicorn’ competition had a fantastic response. To celebrate the coming release of Utopic Unicorn and of the BQ phone, the maker of the best origami unicorn would be awarded a new phone. The crowd did not hesitate to tackle the complex paper-bending challenge and came up with a bunch of creative outcomes. We were very impressed to see how many people managed to complete the instructions, as I didn’t manage to get beyond step 15…

Ubuntu fans

Twitter   Search - #dconstruct #ubuntu

on September 11, 2014 09:57 AM

I read an interesting article on OMG! Ubuntu! about whether Canonical will enter the wearables business, now that the smartwatch industry is hotting up.

On one hand (pun intended), it makes sense. Ubuntu is all about convergence; a core platform from top to bottom that adjusts and expands across different form factors, with a core developer platform, and a focus on content.

On the other hand (pun still intended), the wearables market is another complex economy that is heavily tethered, both technically and strategically, to existing markets and devices. If we think success in the phone market is complex, success in the burgeoning wearables market is going to be just as complex.

Now, to be clear, I have no idea whether Canonical is planning on entering the wearables market or not. It wouldn’t surprise me if this is a market of interest, though, as the investment in Ubuntu over the last few years has been in building a platform that could ultimately scale. It is logical to think this could map to a smartwatch as “another form factor”.

So, if technically it is doable, Canonical should do it, right?

No.

I want to see my friends and former colleagues at Canonical succeed, and this needs focus.

Great companies focus on doing a small set of things and doing them well. Spiraling off in a hundred different directions means dividing teams, dividing focus, and limiting opportunities. To use a tired saying…being a “jack of all trades and master of none”.

While all companies can be tempted in this direction, I am happy that on the client side of Canonical, the focus has been firmly placed on phone. TV has taken a back seat, tablet has taken a back seat. The focus has been on building a featureful, high-quality platform that is focused on phone, and bringing that product to market.

I would hate to think that Canonical would get distracted internally by chasing the smartwatch market while it is young. I believe it would do little but direct resources away from the major push now, which is phone.

If there is something we can learn from Apple here, it is that it isn’t important to be first. It is important to be the best. Apple rarely ships the first innovation, but they consistently knock it out of the park by building brilliant products that become best in class.

So, I have no doubt that the exciting new convergent future of Ubuntu could run on a watch, but let’s keep our heads down and get the phone out there and rocking; the wearables and other form factors can come later.

on September 11, 2014 05:11 AM

September 10, 2014

The Open Source movement has evolved into other areas of computing.  Open Data, Open Hardware, and, the topic that I want to talk about, Open Science are three examples of this.  Since I’m a biologist, I’m deeply connected to the science community, but I also want to tie my hobby of FOSS/Linux into my work.  There are many non-coding (and coding) based tools and groups that one can use for research, and I want to talk about a few of them.

Mozilla Science Lab

Mozilla, the creators of Firefox and Thunderbird, started a group last year that aims to help scientists, “to use the power of the open web to change the way science is done. [They] build educational resources, tools and prototypes for the research community to make science more open, collaborative and efficient.” (main page of Mozilla Science Lab).

Right now, they are focusing on teaching scientists basic research skills via the Software Carpentry project.  I know they are also planning some projects on the community-building side for non-coders.  I don’t know what those projects are yet, but they will be listed soon on the group’s mailing list.  For myself, I can’t wait to get my hands on those projects and help them grow.

Open Science Framework

The Open Science Framework is another fairly new project, started within the last two years by the Center for Open Science. It provides a framework covering the “entire research lifecycle: planning, execution, reporting, archiving, and discovery” (main page of OSF) and lets scientists share that work with the rest of their team, even when they are not in the same place as the lead researcher.

I think this is one of the best tools out there because it allows you to upload files directly to the site and also pull them in from Dropbox and other services.  I’ve played around with it a bit but haven’t used it fully; when I do, I will write a post about it.

Open Notebook Science

Open Notebook Science is maybe the oldest project there is for Open Science.  It’s the idea of having your lab notebook publicly available online, and there is a small network of such notebooks.

I think, along with the OSF project, it is one of the best tools out there, mainly because the data and other materials are publicly available online, so everyone can learn from your mistakes or work with the data.

Hopefully, as time goes by, these projects will grow and researchers will be able to collaborate better.

 


on September 10, 2014 10:39 PM

This little snippet of ~200 lines of YAML is the exact OpenStack that I'm deploying tonight, at the OpenStack Austin Meetup.

Anyone with a working Juju and MAAS setup and 7 registered servers should be able to deploy this same OpenStack setup in about 12 minutes, with a single command.


$ wget http://people.canonical.com/~kirkland/icehouseOB.yaml
$ juju-deployer -c icehouseOB.yaml
$ cat icehouseOB.yaml

icehouse:
  overrides:
    openstack-origin: "cloud:trusty-icehouse"
    source: "distro"
  services:
    ceph:
      charm: "cs:trusty/ceph-27"
      num_units: 3
      constraints: tags=physical
      options:
        fsid: "9e7aac42-4bf4-11e3-b4b7-5254006a039c"
        "monitor-secret": AQAAvoJSOAv/NRAAgvXP8d7iXN7lWYbvDZzm2Q==
        "osd-devices": "/srv"
        "osd-reformat": "yes"
      annotations:
        "gui-x": "2648.6688842773438"
        "gui-y": "708.3873901367188"
    keystone:
      charm: "cs:trusty/keystone-5"
      num_units: 1
      constraints: tags=physical
      options:
        "admin-password": "admin"
        "admin-token": "admin"
      annotations:
        "gui-x": "2013.905517578125"
        "gui-y": "75.58013916015625"
    "nova-compute":
      charm: "cs:trusty/nova-compute-3"
      num_units: 3
      constraints: tags=physical
      to: [ceph=0, ceph=1, ceph=2]
      options:
        "flat-interface": eth0
      annotations:
        "gui-x": "776.1040649414062"
        "gui-y": "-81.22811031341553"
    "neutron-gateway":
      charm: "cs:trusty/quantum-gateway-3"
      num_units: 1
      constraints: tags=virtual
      options:
        ext-port: eth1
        instance-mtu: 1400
      annotations:
        "gui-x": "329.0572509765625"
        "gui-y": "46.4658203125"
    "nova-cloud-controller":
      charm: "cs:trusty/nova-cloud-controller-41"
      num_units: 1
      constraints: tags=physical
      options:
        "network-manager": Neutron
      annotations:
        "gui-x": "1388.40185546875"
        "gui-y": "-118.01156234741211"
    rabbitmq:
      charm: "cs:trusty/rabbitmq-server-4"
      num_units: 1
      to: mysql
      annotations:
        "gui-x": "633.8120727539062"
        "gui-y": "862.6530151367188"
    glance:
      charm: "cs:trusty/glance-3"
      num_units: 1
      to: nova-cloud-controller
      annotations:
        "gui-x": "1147.3269653320312"
        "gui-y": "1389.5643157958984"
    cinder:
      charm: "cs:trusty/cinder-4"
      num_units: 1
      to: nova-cloud-controller
      options:
        "block-device": none
      annotations:
        "gui-x": "1752.32568359375"
        "gui-y": "1365.716194152832"
    "ceph-radosgw":
      charm: "cs:trusty/ceph-radosgw-3"
      num_units: 1
      to: nova-cloud-controller
      annotations:
        "gui-x": "2216.68212890625"
        "gui-y": "697.16796875"
    cinder-ceph:
      charm: "cs:trusty/cinder-ceph-1"
      num_units: 0
      annotations:
        "gui-x": "2257.5515747070312"
        "gui-y": "1231.2130126953125"
    "openstack-dashboard":
      charm: "cs:trusty/openstack-dashboard-4"
      num_units: 1
      to: "keystone"
      options:
        webroot: "/"
      annotations:
        "gui-x": "2353.6898193359375"
        "gui-y": "-94.2642593383789"
    mysql:
      charm: "cs:trusty/mysql-1"
      num_units: 1
      constraints: tags=physical
      options:
        "dataset-size": "20%"
      annotations:
        "gui-x": "364.4567565917969"
        "gui-y": "1067.5167846679688"
    mongodb:
      charm: "cs:trusty/mongodb-0"
      num_units: 1
      constraints: tags=physical
      annotations:
        "gui-x": "-70.0399979352951"
        "gui-y": "1282.8224487304688"
    ceilometer:
      charm: "cs:trusty/ceilometer-0"
      num_units: 1
      to: mongodb
      annotations:
        "gui-x": "-78.13333225250244"
        "gui-y": "919.3128051757812"
    ceilometer-agent:
      charm: "cs:trusty/ceilometer-agent-0"
      num_units: 0
      annotations:
        "gui-x": "-90.9158582687378"
        "gui-y": "562.5347595214844"
    heat:
      charm: "cs:trusty/heat-0"
      num_units: 1
      to: mongodb
      annotations:
        "gui-x": "494.94012451171875"
        "gui-y": "1363.6024169921875"
    ntp:
      charm: "cs:trusty/ntp-4"
      num_units: 0
      annotations:
        "gui-x": "-104.57728099822998"
        "gui-y": "294.6641273498535"
  relations:
    - - "keystone:shared-db"
      - "mysql:shared-db"
    - - "nova-cloud-controller:shared-db"
      - "mysql:shared-db"
    - - "nova-cloud-controller:amqp"
      - "rabbitmq:amqp"
    - - "nova-cloud-controller:image-service"
      - "glance:image-service"
    - - "nova-cloud-controller:identity-service"
      - "keystone:identity-service"
    - - "glance:shared-db"
      - "mysql:shared-db"
    - - "glance:identity-service"
      - "keystone:identity-service"
    - - "cinder:shared-db"
      - "mysql:shared-db"
    - - "cinder:amqp"
      - "rabbitmq:amqp"
    - - "cinder:cinder-volume-service"
      - "nova-cloud-controller:cinder-volume-service"
    - - "cinder:identity-service"
      - "keystone:identity-service"
    - - "neutron-gateway:shared-db"
      - "mysql:shared-db"
    - - "neutron-gateway:amqp"
      - "rabbitmq:amqp"
    - - "neutron-gateway:quantum-network-service"
      - "nova-cloud-controller:quantum-network-service"
    - - "openstack-dashboard:identity-service"
      - "keystone:identity-service"
    - - "nova-compute:shared-db"
      - "mysql:shared-db"
    - - "nova-compute:amqp"
      - "rabbitmq:amqp"
    - - "nova-compute:image-service"
      - "glance:image-service"
    - - "nova-compute:cloud-compute"
      - "nova-cloud-controller:cloud-compute"
    - - "cinder:storage-backend"
      - "cinder-ceph:storage-backend"
    - - "ceph:client"
      - "cinder-ceph:ceph"
    - - "ceph:client"
      - "nova-compute:ceph"
    - - "ceph:client"
      - "glance:ceph"
    - - "ceilometer:identity-service"
      - "keystone:identity-service"
    - - "ceilometer:amqp"
      - "rabbitmq:amqp"
    - - "ceilometer:shared-db"
      - "mongodb:database"
    - - "ceilometer-agent:container"
      - "nova-compute:juju-info"
    - - "ceilometer-agent:ceilometer-service"
      - "ceilometer:ceilometer-service"
    - - "heat:shared-db"
      - "mysql:shared-db"
    - - "heat:identity-service"
      - "keystone:identity-service"
    - - "heat:amqp"
      - "rabbitmq:amqp"
    - - "ceph-radosgw:mon"
      - "ceph:radosgw"
    - - "ceph-radosgw:identity-service"
      - "keystone:identity-service"
    - - "ntp:juju-info"
      - "neutron-gateway:juju-info"
    - - "ntp:juju-info"
      - "ceph:juju-info"
    - - "ntp:juju-info"
      - "keystone:juju-info"
    - - "ntp:juju-info"
      - "nova-compute:juju-info"
    - - "ntp:juju-info"
      - "nova-cloud-controller:juju-info"
    - - "ntp:juju-info"
      - "rabbitmq:juju-info"
    - - "ntp:juju-info"
      - "glance:juju-info"
    - - "ntp:juju-info"
      - "cinder:juju-info"
    - - "ntp:juju-info"
      - "ceph-radosgw:juju-info"
    - - "ntp:juju-info"
      - "openstack-dashboard:juju-info"
    - - "ntp:juju-info"
      - "mysql:juju-info"
    - - "ntp:juju-info"
      - "mongodb:juju-info"
    - - "ntp:juju-info"
      - "ceilometer:juju-info"
    - - "ntp:juju-info"
      - "heat:juju-info"
  series: trusty
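
Once juju-deployer returns, a quick way to watch the deployment converge is plain old juju status (a sketch, not part of the original post; unit names and timings will vary):

$ juju status              # full status of every service, unit, and machine
$ watch -n 30 juju status  # poll until the unit agent-states report "started"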

:-Dustin
on September 10, 2014 09:54 PM

The start of the jessie freeze is quickly approaching, so now is a good time to ensure that packages you rely on will be part of the upcoming release. Thanks to automated removals, the number of release critical bugs has been kept low, but this was achieved by removing many packages from jessie: 841 packages from unstable are not part of jessie, and won’t be part of the release if things don’t change.

It is actually simple to check if you have packages installed locally that are part of those 841 packages:

  1. apt-get install how-can-i-help (available in backports if you don’t use testing or unstable)
  2. how-can-i-help --old
  3. Look at packages listed under Packages removed from Debian ‘testing’ and Packages going to be removed from Debian ‘testing’
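
Putting those three steps together on a stable (wheezy) machine would look roughly like this; the -t wheezy-backports target is an assumption for this sketch and is only needed if you are not running testing or unstable:

  $ sudo apt-get install -t wheezy-backports how-can-i-help
  $ how-can-i-help --old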

Then, please fix all the bugs :-) Seriously, not all RC bugs are hard to fix. A good starting point to understand why a package is not part of jessie is tracker.d.o.

On my laptop, the two packages that are not part of jessie are the geeqie image viewer (which looks likely to be fixed in time), and josm, the OpenStreetMap editor, due to three RC bugs. It seems much harder to fix… If you fix it in time for jessie, I’ll offer you a $drink!

on September 10, 2014 07:28 PM

Last week, we announced our "Ubuntu Loves Developers" effort! We got some great feedback and coverage. Multiple questions arose around how to help and be part of this effort, so here is a post to answer them. :)

Our philosophy

First, let's define the core principles around the Ubuntu Developer Tools Center and what we are trying to achieve with this:

  1. UDTC will always download, test, and support the latest available upstream developer stack. No version set in stone for 5 years; we get the latest and best release that upstream delivers to all of us. We are conscious that being able to develop on a freshly updated environment is one of the core values of the developer audience, and that's why we want to deliver that experience.
  2. We know that developers want overall stability and don't want to upgrade or spend time maintaining their machine every 6 months. We agree they shouldn't have to, and the platform should "get out of my way, I've got work to do". That's the reason why we focus heavily on the latest LTS release of Ubuntu. All tools will always be backported to and supported on the latest Long Term Support release. Tests run multiple times a day on this platform. In addition to this, we support, of course, the latest available Ubuntu release for developers who like to live on the edge!
  3. We want to ensure that the supported developer environment is always functional. Indeed, by always downloading the latest version from upstream, the software stack can change its requirements, start needing newer or extra libraries, and thus break. That's why we run a whole suite of functional tests multiple times a day, on both the version you can find in the distro and the latest trunk. That way we know if:
  • we broke something in trunk and need to fix it before releasing;
  • the platform broke one of the developer stacks, so we can promptly fix it;
  • a third-party application or a website changed and broke the integration, so we can fix it really early on.

All those tests ensure the best experience we can deliver, while always fetching the latest released version from upstream, and all of this on a very stable platform!

Sounds cool, how can I help?

Report bugs and propose enhancements

The most direct way of reporting a bug or making suggestions is through the upstream bug tracker. Of course, you can also reach out to us on social networks like G+, through the comments section of this blog, or on IRC: #ubuntu-desktop on freenode. We are also starting to watch the #ubuntulovesdevs hashtag.

The tool is really there to help developers, so do not hesitate to help us steer the Ubuntu Developer Tools Center in the direction that works best for you. :)

Help translating

We have already had some good translation contributions through Launchpad! Thanks to all our translators, we have Basque, Chinese (Hong Kong), Chinese (Simplified), French, Italian, and Spanish! There are only a few strings up for translation in udtc, and it should take less than half an hour in total to add a new one. It's a very good and useful way to contribute for people who speak languages other than English! We do look at the translations and merge them into mainline automatically.

Contribute on the code itself

Some people have started to offer code contributions, and that's very good and motivating news. Do not hesitate to fork us on the upstream GitHub repo. We'll ensure we keep up to date with all code contributions and pull requests. If you have any questions, or for better coordination, open a bug to start the discussion around your awesome idea. We'll try to be around and guide you on how to add support for any framework! You will not be alone!

Write some documentation

We have some basic user documentation. If you feel there are any gaps or anything missing, feel free to edit the wiki page! You can also merge in some of the documentation from the README.md file or propose enhancements to it!

To give an easy start to any developer who wants to hack on udtc itself, we try to keep the README.md file readable and in line with the current code. However, it can drift a little; if you think any part is missing or needs more explanation, you can propose modifications to it to help future hackers get an easier start. :)

Spread the word!

Finally, spread the word that Ubuntu Loves Developers, and we mean it! Talk about it on social networks (tagging posts with #ubuntulovesdevs), write blog posts, or just chat with your local community! We deeply care about our developer audience on the Ubuntu Desktop and Server, and we want this to be known!

uld.png

For more information and hopefully goodness, we'll have an Ubuntu On Air session soon! We'll keep you posted on this blog when we have the final dates and details.

If you feel that I forgot to mention anything, do not hesitate to point it out as well; that is another very welcome form of contribution! ;)

Next week I'll discuss how we maintain and run tests to ensure your developer tools are always working and supported!

on September 10, 2014 02:16 PM

When we set up Freexian’s offer to bring together funding from multiple companies in order to sponsor the work of multiple developers on Debian LTS, one of the rules that I imposed is that all paid contributors must provide a public monthly report of their paid work.

While the LTS project officially started in June, the first month in which contributors were actually paid was July. Freexian sponsored Thorsten Alteholz and Holger Levsen for 10.5 hours each in July and for 16.5 hours each in August. Here are their reports:

It’s worth noting that Freexian sponsored Holger’s work to fix the security tracker to support squeeze-lts. It’s my belief that using the money of our sponsors to make it easier for everybody to contribute to Debian LTS is money well spent.

As evidenced by the progress bar on Freexian’s offer page, we have not yet reached our minimal goal of funding the equivalent of a half-time position. And it shows in the results: dla-needed.txt still lists around 30 open issues. This is slightly better than the state two months ago, but we can improve a lot on the average time to push out a security update…

To get an idea of the relative importance of the contributions of the paid developers, I counted the number of uploads made by Thorsten and Holger since July: of 40 updates, they took care of 19, so about half.

I also looked at the other contributors: Raphaël Geissert stands out with 9 updates (I believe that he is contracted by Électricité de France for this), and most of the other contributors look like regular Debian maintainers taking care of their own packages (Paul Gevers with cacti, Christoph Berg with postgresql, Peter Palfrader with tor, Didier Raboud with cups, Kurt Roeckx with openssl, Balint Reczey with wireshark), except Matt Palmer and Luciano Bello, who are (likely) volunteer members of the LTS team.

There are multiple things to learn here:

  1. Paid contributors already handle almost 70% of the updates. Counting only on volunteers would not have worked.
  2. Quite a few companies that promised help (and got mentioned in the press release) have not delivered the promised help yet (neither through Freexian nor directly).

Last but not least, this project wouldn’t exist without the support of multiple companies and organizations. Many thanks to them:

Hopefully this list will expand over time! Any help to reach out to new companies and organizations is more than welcome.


on September 10, 2014 11:30 AM

Grantlee based on Qt 5

Stephen Kelly

While I’ve been off the grid I received an unusually high concentration of requests for a Qt5-based Grantlee. I’m working on getting that done for September.


on September 10, 2014 10:08 AM

I Did it My Way

Stephen Kelly

I flew to Bilbao on the 11th of August and took a train to Burgos. From there I walked 500km over 21 days to arrive in Santiago. Although technically a Catholic pilgrimage, I approached it secularly as a walking holiday, a different environment while in-between jobs and some time for myself. Everyone walks their own Camino.

First Insight: There is suffering

On arrival in Burgos, I needed to first get a ‘Credential’, which allows sleeping in the albergues for Peregrinos along the route and records qualification to receive a Compostela at the end of the journey. A credential may be obtained from the municipal albergue in Burgos, so that was my first stop. It was almost 8pm when I arrived and the Hospitalero had been telling people since 4pm that the albergue was full. However, he kept a spare bed for such late arrivals and luckily he gave it to me. It made for a good start.

Many sunflowers on the Meseta

Stated in a deliberately impersonal way, the nature of suffering on the Camino was immediately apparent when I arrived in Burgos. For most people (with sufficient time), the Camino starts in St. Jean Pied de Port in France, from where pilgrims walk over the Pyrenees and through Pamplona before arriving in Burgos two weeks later. Colloquially, this first of three stages of the walk is known as the ‘Camino of Suffering’, where the pilgrim starts the challenge and experience. I spent the first evening with a few people who had developed blisters and tendonitis, people who were bidding farewell to the journey already (some people do it over multiple years), and people who were bidding farewell to companions.

In the morning I got up at 6am and started walking in the dark out of Burgos, following the centuries-old route along the second stage – the ‘Camino of Death’, so called because of the barren, flat landscape surroundings until reaching Astorga.

Initially I walked with one Italian guy and we soon caught up with two more Italians. Most of the time walking the Camino is spent walking alone though, so that little group quickly dissolved as people broke away or fell behind (ahem!). A social shock I experienced is that when someone in a group stops or slows (even someone you've known for more than a few hours or days), the others simply continue on – they’re sure to be re-united at a break point or end point later along the way. It’s something to get used to.

My shadow’s the only one that walks beside me… I have tonnes of photos like this. Depending on whether I like you, I might sit you down to show you all of them!

Second Insight: Suffering should be understood

One of the things I learned on the Camino is that ‘normal’ is partly an environmental concept. It became ‘normal’ for me to get up at 6am, walk 20-30km per day, eat with some strangers and some familiar faces and go to sleep at 10pm. It did take about a week for that to become ‘normal’ (and even enjoyable!), but it is certainly not similar to my Berlin experience.

Dawn

I had had blisters since the first day of walking, but by the time I reached Bercianos I was no longer able to stand, let alone walk. I took a day off, followed by a 7km walk to the next town, where I happened to meet a foot doctor from Berlin who gave me all the help I needed, including her sandals. My footwear was the problem causing my blisters (I was walking in running shoes), so I bought myself a good, expensive pair of sandals when I got to Leon, gave the good doctor hers back, and had no new problems caused by footwear for the next two weeks. What are the odds of a foot doctor having the same size feet as me?

Chica the dog

Most of the people I encountered on the Camino were Italian, with Spaniards a close second and plenty of Germans. I fell into a good rhythm with a group of Germans for the last two weeks, which was nice. We spoke German as our common language.

camino-5

Third Insight: Suffering (of my Camino) has been understood

Astorga is a beautiful town and it marks the transition of the peregrino from the ‘Camino of Death’ into the ‘Camino of Life’. The route through Galicia is much steeper than the previous stages and full of the sights, sounds and smells of dairy farms. Most of the milk produced in Spain is produced here. The sunflowers filling the landscape and the trail are long since gone.

Although the blisters did not return, the steep climbs and descents brought with them some tendonitis for the final two days of walking. Not much to complain about, timing wise.

It’s sad that the experience of walking the Camino can’t be captured by any camera or prose, but must be experienced to be understood. That’s just part of the nature of suffering :).

camino-6

Like many, I continued on to Finisterre for a few days of sleeping on the beach and unlike most I spent the weekend after in Barcelona, for a very different experience of parks and beaches.


on September 10, 2014 10:04 AM
The results are in!

I'd like to start by addressing some of the issues with running this type of contest. We try to communicate the rules for this contest as clearly as possible; for example, you cannot even join the Flickr group without accepting the rules. However, we cannot force people to read terms and conditions, so we try to warn contestants if they aren't following the rules so they can correct their submission before the deadline. Unfortunately we cannot force people to read their emails either, so we always have a few popular submissions that get disqualified. This contest was no different: we had a few problems with wallpaper sizes and, of course, licensing issues.

With that being said, full contest results can be seen here, and down below are the five wallpapers that will be included in Lubuntu 14.10, a big congratulations to you all:

 Sunset Over Lake by Andrei Daniel Ticlean
 Turn Back by Kari Wagner
 Void by Marxco
 DragonFly4 by Earl Lunt
 Colori D'autunno by Quellicol1000

I would also like to thank our friend Guillaume at Picompete for helping us host this contest. He's been helping us for years now, and we really appreciate everything he's done for us. It's a trustworthy service that we really recommend if you wish to host similar contests. Guillaume is also a big fan of open source and Linux, and he has offered to help other distributions if they ever need to host a poll.


http://picompete.com/contest/1441/lubuntu-14-10-community-wallpaper-contest/



on September 10, 2014 07:13 AM

Memebook, available in the Ubuntu phone store

Last week I published a new app to the Ubuntu Store, which isn’t anything particularly new, but this time I didn’t use my normal license; instead, I went permissive. This is something I’ve been wavering on for a while now, and it is the result of some developer soul-searching about why I’ve been using the GPL and what it’s done for me.

Free as in mine

In the past I’ve always used the GPL or LGPL, not for philosophical reasons or because I thought software should be free (in the RMS sense), but because I was selfish. I used the GPL because I wanted to make sure nobody built something on top of my work without sharing it back to me. In my mind, using a strong copyleft license ensured I couldn’t be left out of someone else’s success with my project. And in a way it worked: I wasn’t left out, because most people never used, let alone built on, my projects. I was trying to solve a problem I didn’t actually have.

You aren’t gonna need it

YAGNI (You aren’t gonna need it) is a principle of extreme programming that says you shouldn’t add features to a project until you know they’re actually necessary.  I don’t usually pay much mind to trendy programming methods, but I think this one might be applicable to the way I pick licenses. If my projects aren’t being used and extended by others, why am I worried about it happening enough that I want to put restrictions on them? Maybe I don’t need the GPL’s protections after all.

A new direction

So from now on I’m going to prefer the BSD license for new projects, and I’ll work on converting old ones to this license where I’ve been the only contributor. The worst that can happen is that somebody benefits from my code more than I do, but to me that wouldn’t be much different from having nobody benefit from it at all. I won’t actually lose anything. Nor will I be restricting my future options; on the contrary, I can always go from BSD to GPL, but going the other direction is quite a bit harder once you accept contributions.

on September 10, 2014 02:41 AM