July 06, 2015

tl;dr: You can now use the Python 3 version of launchpadlib from ppa:zyga/launchpadlib-backports. Read on if you want to find out how I did it.

Today a colleague of mine asked for a small hint on how to do something with launchpadlib. I asked for a code sample and immediately stopped, seeing that it was Python 2 code. As Python 2 is really not the way to start anything new in Ubuntu nowadays, I looked at what was stopping us from using Python 3.

It turns out my colleague was using the LTS release (14.04), and the Python 3 version of launchpadlib just wasn't available for it at the time. Bummer.

Having a quick look at the dependencies, I decided to help everyone out and create a PPA with the backported packages. Since this is a common process I thought I would share my approach, both to let others know and to give more knowledgeable Debian developers a chance to correct me if I'm wrong.

The whole process starts with getting the source of the package you want to build. I wanted the latest and greatest packages, so I grabbed the source package from Ubuntu 15.10 (wily). To do that I just go to packages.ubuntu.com and search for the right package. Here I wanted the python3-launchpadlib package. On the right-hand side you can see the link to the .dsc file. You want that link, so copy it now.

The right way to download a Debian source package is to use dget. Using a temporary directory as a workspace, execute this command (if you are reading this later the source package may no longer be available, so you'd have to adjust the version numbers to reproduce the process):

dget http://archive.ubuntu.com/ubuntu/pool/main/p/python-launchpadlib/python-launchpadlib_1.10.3-2.dsc

With that package unpacked, change into the sub-directory containing the unpacked code. At this stage you need an sbuild chroot for Ubuntu 14.04. If you don't have one, it's time to make one now. You want to follow the excellent article on the Debian wiki for this. Many parts are just copy-paste, but the final command you need to run is this:

sudo sbuild-createchroot --make-sbuild-tarball=/var/lib/sbuild/trusty-amd64.tar.gz trusty `mktemp -d` http://archive.ubuntu.com/ubuntu

Note that you cannot _just_ run it, as there are some groups you have to add and be a member of first. Go read the article for that.

So with the sbuild chroot ready to build our trusty packages, let's see what we get. Note that, in general, the process involves just these four steps:

  1. sbuild -d trusty
  2. dch # edit the changelog
  3. dpkg-buildpackage -S -sa
  4. dput ppa:zyga/launchpadlib-backports ../*_source.changes

After doing step one you'll see that you have missing build dependencies: python-keyring, lazr.restfulclient, python-oauth and python-wadllib. We need to backport those too!

At this point the process continues recursively. You grab each .dsc file with dget and try to sbuild it right there. Luckily, you will find that nothing here has further dependencies and that each of the four packages builds cleanly.

At this point you want to create a new PPA. Just go to your Launchpad profile page and look for the link to create one. The PPA will serve as a buffer for all the packages so that we can finally build the package we are after. Without the PPA we'd have to build a local apt repository, which is only mildly more difficult but not needed, since we want to share our packages anyway.

With the PPA in place you can now start preparing each package for upload. As a habit I bump the version number and change the target distribution from wily / unstable to trusty. I also add a changelog entry that explains this is a backport and mentions the name of the PPA. The version number is a bit more tricky. You want your packages to differ from any packages in the Ubuntu archive so that eventual upgrades work okay. The way I do that is to use an (x-).1 version, which always sorts lower than the corresponding x version in Ubuntu. Let's see how this works for each of the packages we have here.
  • lazr.restfulclient has the Debian version 0.13.4-5, which I change to 0.13.4-5ubuntu0.1. This way Ubuntu can still upload 0.13.4-5ubuntu1 and Debian can upload 0.13.4-6, and users will get the correct update. Nice.
  • python-keyring has the Ubuntu version 4.0-1ubuntu1, which I change to 4.0-1ubuntu1.1 so that a subsequent 4.0-1ubuntu2 can be uploaded to Ubuntu without any conflicts.
  • python-oauth has the Debian version 1.0.1-4, which I change to 1.0.1-4ubuntu0.1 to let Ubuntu update to -ubuntu1 eventually, if needed.
  • python-wadllib has the Debian version 1.3.2-3, which I change to 1.3.2-3ubuntu0.1 in exactly the same way.
Have a look at an example changelog to get a feel for how this all works together.
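For illustration, a backport changelog stanza following this scheme might look roughly like the one below; the maintainer name, email address and date are just placeholders:

python-wadllib (1.3.2-3ubuntu0.1) trusty; urgency=medium

  * No-change backport to trusty for ppa:zyga/launchpadlib-backports.

 -- Jane Doe <jane@example.com>  Mon, 06 Jul 2015 10:00:00 +0000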

Now all the dependencies are ready and I can do the final test build of launchpadlib itself. Since I always test-build everything, I now need to expose my PPA so that my sbuild chroot (which otherwise knows nothing about it) can get the missing packages and build everything. This is the magic line that does it:

sbuild -d trusty --chroot-setup-commands='apt-key adv --keyserver keyserver.ubuntu.com --recv-key E62E6AAB' --extra-repository "deb http://ppa.launchpad.net/zyga/launchpadlib-backports/ubuntu trusty main"

Here we have two new arguments to sbuild. First, we use --chroot-setup-commands to import the public key that Launchpad uses to sign packages in my archive. Note that the key identifier is unique to my PPA (and probably to my Launchpad account); you want to check the key listed on the page of the PPA you created. The second argument, --extra-repository, just makes our PPA visible to apt inside the chroot so that all the dependencies can be resolved. On more recent versions of Ubuntu you can also use the [trusted=yes] suffix, but that doesn't work for Ubuntu 14.04.

After all the uploads are done you should wait and check that all the packages are built and published. This is clearly visible under the "package details" link on the PPA. If you see a spinning stopwatch, the package is building. If you see a green cogwheel, the package has built but is not yet published into the PPA (those are separate steps, a bit like make and make install). When all the packages were ready I copied all the binary packages (without rebuilding) from trusty to utopic, vivid and wily so that anyone can use the PPA. The wily copy is a bit redundant, but it lets users follow the same instructions even if they don't actually need anything, without getting weird errors they might not understand.
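With python3-launchpadlib installed from the PPA, a quick sanity check could look roughly like the snippet below; the application name is just a placeholder, and login_with will prompt for OAuth authorization on first run:

#!/usr/bin/env python3
# Minimal launchpadlib smoke test; 'example-app' is a placeholder name.
from launchpadlib.launchpad import Launchpad

# login_with caches credentials after the first (interactive) authorization.
lp = Launchpad.login_with('example-app', 'production')
bug = lp.bugs[1]   # fetch bug #1 from Launchpad
print(bug.title)   # if this prints, the Python 3 stack works end to end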

So there you have it. The whole process took around an hour and a half (including writing this blog post). Most of the time was spent waiting on particular test builds and on the launchpad package publisher.

If you have any comments, hints or suggestions please leave them in the comments section below. Thanks.
on July 06, 2015 11:35 AM
Having an @ubuntu.com email address has some clout. After trying for six months to get a response from the missing LoCos in Africa using my old Gmail address, some of them never replied; now, retrying with @ubuntu.com, I am getting somewhere. #ubuntu-africa is slowly growing and I have received replies from other countries that are all pleased that a LoCo revival is underway. We have grown to the stage where we will have our first meeting on the 29th of this month at 20:30 UTC+2. Anyone else is welcome to join us. I hope to see many of you there on IRC > Freenode > #ubuntu-africa.
Everyone is welcome to promote our site http://ubuntu-africa.info
on July 06, 2015 08:38 AM

Bad Tech Coming

Stephen Michael Kellat

In general, I don't like talking about products that haven't even had their formal presentations yet at places like DEF CON. As reported in The Register, a gentleman at Rhino Security named Benjamin Caudill created hardware to provide a 900 MHz remote bridge to a distant Wi-Fi network. Wired also takes time to write about this.

There are many, many, many problems with this. If anything I could write an entire "occasional paper" for presentation about this. While the security expert in question may have designed this system it runs afoul of FCC regulations in the United States at the barest minimum. Part 15 of Title 47 of the Code of Federal Regulations has many provisions that deal with unlicensed devices like this proposed hardware proxy that also stipulate hard power limits.

Having a 125 milliwatt signal carry for a mile assumes rather optimal conditions, considering that the device would be a secondary user of the frequency band and legally required to shut down if it interfered with licensed users, let alone the primary Industrial/Scientific/Medical device users of the band. In general it seems like 47 CFR 15.247 might apply. This section deals with spread spectrum hopping and how broad a frequency range signals can hop over. For faster speeds via RF links you need broader RF signals, measured megahertz wide. If you have narrower bandwidth restrictions (not to be confused with wired broadband throughput), you are assured of not having a fast link.

Running outside the rules and playing poorly in the radio band is generally not tolerated. Amateur Radio (a.k.a. "ham radio") operators use 902-928 MHz in ITU Region 2. For our discussion's purposes, ITU Region 2 can be defined as the Americas, the Caribbean, Greenland, and some but not all islands in the eastern Pacific. Amateur Radio operators also engage in transmitter hunting, a form of Radio Direction Finding practised for sport. Anybody misusing one of these devices is likely to be found easily by Amateur Radio operators and reported to the authorities. The heavy transmitter duty cycle needed to maintain a constant link with a remote head would make this system an easier target to locate than most games of transmitter hunting, which are played with weak-signal, intermittent, well-hidden transmitters.

In short, I'm not a programmer. The FCC still feels that I'm allowed to play with up to 1.5 kilowatts of transmitting power on a variety of bands to communicate as an Amateur Radio Operator. Far less complicated methods of making transmissions via radio do exist that reduce chances of being caught. Ary Boender has an entire website devoted to clandestine transmissions and there is an entire forum devoted to discussing these things. The Lincolnshire Poacher served its purpose for many years. Why ignore tradition and history when it shows far less complicated ways of accomplishing things?

on July 06, 2015 12:00 AM

July 05, 2015

Stepping out for a while

Nekhelesh Ramananthan

Contributing to Ubuntu Touch has been one of the most incredible journeys I have undertaken. Working late nights during weekends while coordinating with other developers on IRC, meeting new people, and taking on responsibility for important parts of the phone were some of the perks of being a core apps developer. For the first time I was able to truly experience being part of a community. It was also during this time that I got my Ubuntu membership.

I have been contributing for about two years, since Ubuntu Touch was announced, and I am slowly starting to feel a sense of restlessness and am afraid of burning out. I am feeling a bit pressured and desperately need a break before I explode. I am also at a critical point in my life where I need to step away and focus on my personal life.

This has been one of the hardest things for me to admit, because of how much I loved hacking on Ubuntu Touch. However, it gives me an immense sense of pride to use the BQ E4.5 as my daily phone because of my involvement in making it a reality, however small that contribution was in the grand scheme of things.

I suspect that I may be gone for about a year to sort out my personal life. But hopefully this break will give me a fresh perspective and renewed enthusiasm when I return. I think there are two sides to the coin here: being a developer and being a consumer. Up until now, I have been a developer working to improve the platform. Now I will be on the other side of the table, as a consumer using Ubuntu Touch on a daily basis and reading the news as an outsider ;)

Over the next few days, I will spend my time trying to make this transition easier for my colleagues and friends by transferring my personal Bazaar branches to project branches that any other team member can take over to finish the work.

So Long, and Thanks for All the Fish.

on July 05, 2015 10:14 PM

Hi,

Ten days ago, we asked our great community to share their opinions regarding both Ubuntu GNOME 14.04 LTS and Ubuntu GNOME 15.04. During these 10 days, 500 users shared their opinion – WOW!

The Ubuntu GNOME Team would like to thank everyone who took the time to read and answer our questions.

Thank you so much!

This is to announce the end of feedback collection from our lovely community. Soon (hopefully) we shall share the results, and right after that we shall start planning for 15.10 based on your feedback.

We highly appreciate your help, support and direct contributions to making Ubuntu GNOME even better, both community-wise and distribution-wise.

Please note: we have stopped accepting responses.

Thank you!

on July 05, 2015 07:50 PM

Review Meizu MX4 - Ubuntu Edition

Sujeevan Vijayakumaran

Since the end of June you can buy the Meizu MX4 in the EU, if you manage to get an invite. I'm in the group of "Ubuntu Phone Insiders" and got the device a few weeks ago. It's time for a review!

A few months back I did a long review of the bq Aquaris E4.5. The Meizu MX4 is the third available Ubuntu Phone, next to the Aquaris E4.5 and the Aquaris E5, both from bq. The MX4 is the first device from the Chinese manufacturer Meizu.

The Software

I've already covered all the software features in my review of the Aquaris E4.5, and the software on both phones is pretty much the same: the installed apps and scopes are identical. Over the last couple of months Canonical rolled out a few system updates, including the upgrade of the base system from 14.10 to 15.04 Vivid Vervet. Many bugs were fixed and many features were added. Therefore this article doesn't cover many software features.

Software features associated with the MX4's hardware itself are covered in the sections below.

Hardware

The Screen

One of the best things about this device is the screen. At 5.36 inches it is pretty big, and it has a high resolution of 1920 x 1152 pixels, which results in a pixel density of ~418 ppi. The Aquaris E4.5, by comparison, has 240 ppi on a 4.5 inch screen.

Numbers aside, the screen really is great. Fonts, photos and app icons all look sharp, and the brightness and the colors are excellent.

On the other hand, the user interface doesn't yet make the most of the high resolution. App icons and fonts are pretty huge; the Apps scope, for example, only shows three app icons next to each other. This will hopefully be fixed with the next OTA software update, so you can expect the higher resolution to be put to better use in the near future.

The Camera

Next to the display, the camera is the other great thing. It takes 20.7-megapixel photos, and they are pretty good! The Aquaris E4.5's camera takes washed-out photos with poor color reproduction even in good lighting conditions; the MX4, on the other hand, takes good photos even in low light.

There is one disadvantage though: the camera doesn't seem to focus properly in low-light conditions, so pictures easily end up blurred.

Mobile data and WiFi

Many people criticised the missing LTE support on the bq Aquaris E4.5. The MX4 does support the LTE frequencies used by European network providers, and it also supports HSDPA. The phone has one micro-SIM slot on the back; you have to remove the back cover to insert the SIM card.

WiFi is of course also available. The automatic switching between mobile data and WiFi is really buggy: the device often doesn't connect automatically to saved WiFi access points, even when they're available. If you tap the access point manually it connects normally. Similar things happen the other way around: sometimes it claims you're still connected to WiFi even after you've moved out of coverage. I hope this will be fixed in the near future.

Battery life

The battery has a capacity of 3100 mAh. It powers the MediaTek octa-core CPU, 2 GB of RAM and the big display. Sadly there is a bug in the system that consumes too much energy even when the phone is idling in standby, so the battery often discharges very quickly. A similar bug was present on the Aquaris E4.5, but that one was fixed a few weeks before the device was delivered to the first customers.

You can remove the back cover of the phone, but the battery cannot be removed.

Haptics and Quality

The overall finish is great. The frame and the back are made of aluminium, and the device doesn't creak at any point. It has a higher build quality than the Aquaris E4.5. The screen is protected by Gorilla Glass 3, which should be scratch-resistant; even so, my phone picked up a few scratches after I carried it in my pocket for a few weeks, and the back cover got a small scratch too. The camera glass, on the other hand, is protected by sapphire glass. There are three hardware buttons on the device: two volume buttons and the power button. The volume buttons are on the left side and the power button is at the top. The position of the latter is a disadvantage, because you can't easily reach it with a finger. There is also a soft home button below the display.

Even though the device is pretty big, it feels comfortable in my hand. At the beginning you have to be careful until you get used to the slippery back, otherwise you might drop it.

Performance

The phone is powered by a MediaTek octa-core CPU. It has four ARM Cortex-A17 cores and four ARM Cortex-A7 cores and follows the big.LITTLE CPU architecture: the higher-powered cores are only used when they are really needed, otherwise the lower-powered cores are used, which results in lower overall power consumption.

Theoretically the CPU can be pretty fast. In practice the system isn't as fast as you might expect when compared to high-end Android phones. There are still many stutters when you switch between scopes, and the start-up time for apps is still relatively high at roughly three seconds.

The device has 2 GB of RAM, but apps still seem to be killed often and automatically restarted when you switch between them. This is another bug that is currently being worked on, but it is really annoying for the end user.

Conclusion

The device actually has pretty good hardware; the display in particular is really good. The sad thing is that the software isn't in a satisfying state, and the Aquaris E4.5 definitely has much better software support than the MX4. For example, the soft home button below the display also houses the notification LED, which currently doesn't work. Pressing the button switches to the first scope, which is actually a break with the Ubuntu Phone UX design, which doesn't require any buttons besides the power and volume keys. You often press the home button accidentally when using a bottom edge gesture.

Another big problem is overheating. Even if you only use the phone for a couple of minutes it heats up a lot. It doesn't happen all the time, but if you play games or use the browser, for example, the temperature climbs beyond 40°C.

Despite all the negative points, the MX4 is still faster than the Aquaris E4.5 in daily use. Apps start faster, but many parts of the system are still laggy and stuttery, just like on the Aquaris E4.5. It seems to me that the MX4 hasn't received as much love from the developers as the E4.5; the latter has far better hardware support. Hopefully many things will be fixed in upcoming updates. The battery and WiFi issues are the ones that annoy me every day. Even so, the MX4 will be in my pocket next to my Android-powered Nexus 4, whereas the bq Aquaris E4.5 mostly stayed at home, switched off. I really love the screen and the camera, and I hope the software will get better over time!

on July 05, 2015 09:20 AM

July 04, 2015

Yes, I'm going to talk about Apple Music, like literally everyone else online these days.

So I cancelled my premium Spotify subscription after using Apple Music for literally a few days, mainly for one reason: integration. Apple Music managed to combine my existing music library with streaming and their new "revolutionary" *eye roll* radio in such a way that it's more convenient than Spotify.

As iTunes is always open and playing while I'm working on something, it became simpler for me to switch from my music to streaming via the Apple Music integration than to open Spotify.

I'm essentially buying into Apple's media empire even more.

The New Music App

I mostly use Apple Music on my iPad, and while I'm not going to pick on the app UI too much, as this post is about Apple Music the service, it is rather convoluted, especially when you're trying to find music. Where there used to be a breakdown of your music library, there are now Apple Music-related tabs.

"Play this Siri"

Music's integration with Siri is fantastic and makes up for what's lacking in-app for finding songs. You just ask Siri to play any specific album, song or artist (I've yet to stump her) and it plays right away.

Human Radio on Beats 1

Beats 1 is the most compelling thing about Apple Music and its largest differentiator compared with other services. It adds personality to the service by having live people host shows and play music that's to their tastes or that they feel deserves to be played. It's also quite obvious that with Beats 1, Apple wants to be a trendsetter for music.

The shows on Beats 1 are varied enough to give you a range of listening options when you tune in, and you can hear the differences in music taste among the various hosts, which diversifies the service for different audiences. The nicest surprise for me was hearing some of the more niche items from my personal library airing on Beats 1 –which you'd never hear on other radio stations. Also, the lack of annoying ads –except for the rare few-second interstitial saying "Beats 1 is made possible by..." (whom I'm convinced is Eddie Izzard)– and of the usual annoying radio gimmicks is nice.


If you're not using a streaming service and you have an Apple device/computer, Apple Music is a no-brainer –and this is definitely how Apple will get immediate market penetration, by giving it to everyone with an iDevice. Besides, you might as well avail yourself of the free 3-month trial and see if you'll stick with it as your streaming solution –that is, if you're into streaming in the first place.

But if you're on some other streaming service, and you're not that into Apple or have no interest in Apple Music's differentiating features, then don't bother switching. All these services seem to have the same ~30 million song library.

on July 04, 2015 03:00 PM

July 03, 2015

The Aeropress is rather a cult coffee brewer and few non-coffee-nuts know about it. But from the first cup of coffee I brewed with it, I was hooked. I now use it regularly for making coffee and my French press has been relegated to the back of the cupboard. #sorrynotsorry

The Aeropress' brilliance is in its simplicity and ease-of-brew. Measure in your coffee, briefly steep your grounds in the main chamber and then plunge the coffee through a paper filter directly into a mug. Cleanup is simple: unlock & remove the filter cap, press out the coffee puck + paper filter then rinse/wipe off the plunger. Simple.

As for brew quality, if you're consistent, it's consistent. It doesn't make the greatest cup I've ever had, but it does produce a nice, smooth cup of coffee and you can vary that quite a bit with ground levels and types.

The Aeropress' design is rather underwhelming, which may make you skeptical of its capabilities. It is (to borrow a Jony Ive-ism) unapologetically plastic. Which, I suspect, makes it more durable and keeps it relatively inexpensive –you can get one for around $30 USD.

The takeaway: it consistently brews a great cup of coffee, it takes only a few seconds to clean, and it has a plethora of accessories and brewing variations –if you're into that.

on July 03, 2015 09:55 PM

Ubuntu announced its 14.10 (Utopic Unicorn) release almost 9 months ago, on October 23, 2014. As a non-LTS release, 14.10 has a 9-month support cycle and, as such, the support period is now nearing its end: Ubuntu 14.10 will reach end of life on Thursday, July 23rd. At that time, Ubuntu Security Notices will no longer include information or updated packages for Ubuntu 14.10.

The supported upgrade path from Ubuntu 14.10 is via Ubuntu 15.04. Instructions and caveats for the upgrade may be found at:

https://help.ubuntu.com/community/VividUpgrades

Ubuntu 15.04 continues to be actively supported with security updates and select high-impact bug fixes. Announcements of security updates for Ubuntu releases are sent to the ubuntu-security-announce mailing list, information about which may be found at:

https://lists.ubuntu.com/mailman/listinfo/ubuntu-security-announce

Since its launch in October 2004 Ubuntu has become one of the most highly regarded Linux distributions with millions of users in homes, schools, businesses and governments around the world. Ubuntu is Open Source software, costs nothing to download, and users are free to customise or alter their software in order to meet their needs.

Originally posted to the ubuntu-announce mailing list on Fri Jul 3 13:00:54 UTC 2015 by Adam Conrad, on behalf of the Ubuntu Release Team

on July 03, 2015 09:40 PM

Plasma532


No backports PPA required.

Plasma 5.3.2.

Daily Wily Images.

on July 03, 2015 02:06 PM

Announcing the MX4 Challenge Winner

Nekhelesh Ramananthan

Hello everyone, I am thrilled to announce the winner of the Meizu MX4 Challenge! In the interest of being open, I thought I would share a bit about how we arrived at the winner. We received 10 submissions for the challenge, and while that may not seem like a lot, I was personally looking for quality submissions and a proven track record of previous contributions, and trying to put developers and designers on the same playing field. In the process I created strict rules and requirements that reduced the number of people who could enter the challenge.

I reviewed the past contributions of the participants and checked whether they took the effort to present their apps nicely in the store and keep them updated. I observed that some apps did not have screenshots or a proper description to help the user. Many times I have also found brilliant apps in the store that people start relying on, only to notice later that they have been abandoned. All these things affect users, and I emphasised them for this challenge.

It is not the number of apps you have in the store that matters, but rather the effort you put into making your app the best.

The contest submissions were reviewed by Alan Pope and myself. I must admit that in the end choosing a winner was a bit difficult, considering that several of the participants and their submissions were quite good. Luckily I had Alan Pope to help me break the tie.

I wish I had more devices to give away but alas there is just the one Meizu MX4 ;)

So without further ado,

Drum roll ... the winner is Brian Douglass!

Brian is the developer of the well-known alternative Ubuntu app store uApp Explorer, which we have all come to admire and which is used by pretty much everyone to discover new apps and share public links to their apps.

For the contest, he created a UApp Explorer scope. You can check out the screenshots below.

image1 image2

Given the amount of time the developers and designers had to take part in the challenge, I am happy with the submissions and thank all the participants for their effort. I hope you will continue improving the things you worked on and help improve the Ubuntu Touch ecosystem.

on July 03, 2015 12:05 PM

Balsamiq is one of the best tools for quick wireframes creation. It allows you to efficiently and quickly create mockups that give you an idea of how design elements fit in the page.

Some years ago there was a package available for the most popular Linux distributions, but since Adobe dropped support for Linux and Balsamiq is built on top of Adobe AIR, they no longer support Linux either.

As you can see from Balsamiq's downloads page, though, it luckily works well under Wine.

Install Balsamiq with WINE

First things first: install wine.

sudo apt-get install wine

Now, let’s proceed with an easy step-by-step guide.

  1. Download the Balsamiq Bundle that includes Adobe Air
  2. Open a terminal, unzip the bundle and move it to /opt (change the Downloads directory name according to your setup)
    cd Downloads
    unzip Balsamiq*
    sudo mv Balsamiq* /opt
  3. To make life easier, rename the .exe to simply balsamiq.exe
    cd /opt/Balsamiq_Mockups_3/
    mv "Balsamiq Mockups 3.exe" balsamiq.exe
  4. Now you can run Balsamiq Mockups by running it with wine
    wine /opt/Balsamiq_Mockups_3/balsamiq.exe

Add Balsamiq as an application

The last, optional step can save you a lot of time when launching Balsamiq, because it saves you the hassle of typing the command from point 4 above every time you want to launch it (and of remembering the location of the Balsamiq executable). It simply consists of creating a new desktop entry for Balsamiq, which adds it to your operating system's application list.

Create the file ~/.local/share/applications/Balsamiq.desktop with the following content:

[Desktop Entry]
Encoding=UTF-8
Name=Balsamiq Mockups
Icon=/opt/Balsamiq_Mockups_3/icons/mockups_ico_48.png
Exec=wine /opt/Balsamiq_Mockups_3/balsamiq.exe
Type=Application
Categories=Graphics;

If you are on Ubuntu with Unity, you can add the following lines too:

StartupNotify=false
StartupWMClass=balsamiq.exe
X-UnityGenerated=true

Now, just save and have a look at your Dash or Activity Panel to see if it works.


on July 03, 2015 09:16 AM

S08E17 – Gigli - Ubuntu Podcast

Ubuntu Podcast from the UK LoCo

It’s Episode Seventeen of Season Eight of the Ubuntu Podcast! Alan Pope, Mark Johnson, Laura Cowen, and Martin Wimpress are all together again and speaking to your brain.

In this week’s show:

That’s all for this week, please send your comments and suggestions to: show@ubuntupodcast.org
Join us on IRC in #ubuntu-podcast on Freenode
Follow us on Twitter
Find our Facebook Fan Page
Follow us on Google+

on July 03, 2015 07:32 AM

July 02, 2015

One update in Lubuntu 15.04 that hasn't received much attention is lxrandr 0.3.0. It has improved multi-monitor support, with quick options that let you place a second screen either above or to the right of the primary monitor, or a simple checkbox to, for example, show only the external monitor when you plug a laptop into a display with a much higher resolution (say 1920x1080, compared to the 1366x768 screen built into my aging, over-five-year-old laptop). For previous or more complicated setups I had used arandr, which is heavier on resources but is quite a nice application for setting up dual-screen displays; it is written in Python 2 and GTK 2. I remember being at UbuCon at SCALE in 2014 and thinking it would have been easier if the presenter had used arandr to set up the multiple-monitor configuration. Even among people who love arandr, it is worth knowing that installing it also provides an unxrandr command, which reads your current screen configuration and outputs the xrandr command needed to recreate it; bound to a key, that can be convenient for a multi-monitor setup. That command needs arandr's functionality and cannot easily be released as a standalone program.

This makes it a lot easier to set up multiple monitors, or to use only the external one if it is much bigger and mixing monitors of different sizes doesn't appeal to you.

http://brendanperrine.com/screenshotgal/quickoptions.png

This is a simple screenshot of the new interface. The manual test for lxrandr never got updated; I think I started work on that but never finished.

on July 02, 2015 08:13 PM
More often than not, I'm looking at ACPI and UEFI related issues, so I was glad to see that Vincent Zimmer has collected various useful links to firmware-related blogs. Very useful, thanks Vincent!
on July 02, 2015 04:23 PM
More and more phones running Ubuntu Phone are available to buy.
I have been lucky enough to try all three models. You can find the corresponding reviews here: BQ E4.5, BQ E5 and Meizu MX4.

With that in mind, which model should you buy?


E5 | MX4 | E4.5


The BQ E4.5 has the smoothest operating system; being released 5 months before the other two makes a difference.

Personally, I would buy the MX4 for a single reason: its extraordinary camera. It also stands out for its screen, with an enviable resolution.

If you are after battery life, the E5 gets the best results, since its processor is the same as the E4.5's but it comes with a higher-capacity battery (more mAh).

On price, I would buy the E5: for €30 more you get many more features than with the E4.5.


E5 | MX4 | E4.5


For design: MX4
For screen: MX4
For rear camera: MX4
For front camera (for selfies): E5
For data connectivity: MX4
For battery: E5
For price: E4.5
For storage: E5
For power (CPU+RAM): MX4

This last rating, for power, may become important in the medium term, for using the phone as a CPU with a monitor.

Photographs by David Castañón, licensed under CC BY 3.0.
on July 02, 2015 02:06 PM

June has been a great month. A new Ubuntu device, the Meizu MX4, reached the market. I wrote a little review about it, and now I use it as my main phone.

Also, I went to the Hello Tomorrow Conference. An incredible experience.

And last but not least, the Discerning Duck scope had an update.

But now let's see what I did over the last month for the open source world :-)

Donations

In June I received 5 euros in donations. I'll put them towards renewing the VPS later this year.

If you find my contributions to the open source world valuable, please consider making a donation.

What I did

In June I focused mainly on the webbrowser app. There are now 8 branches waiting for review, ready to land in the browser app, with small and big improvements!

I also created a new branch for Oxide, which should be merged in the next few days.

Other than that, I did a lot of code reviews, both for the webbrowser app, to provide the best experience possible, and for some core apps (calculator and reminders).

Also, I’m working on a new app that I hope to publish in a couple of weeks ;-)

If you like my work and want to support me, just send me a Thank you! by email or offer me a beer :-)

Ciao,
R.

on July 02, 2015 10:00 AM

dah-di-dah

dah-di-dah what?


Whiskey Hotel Whiskey

You have just downed a bottle of whiskey. You go to a hotel, and then you have some more whiskey. These steps, while extremely repeatable, will have drastically different outcomes for each person. One of these outcomes is a Do-It-Yourself disaster. Another is "How to get a hangover… or worse," depending on what else is brought into the mix.

So, what happens when you follow this recipe?

The culture of the internet world is changing. Creation is shifting to curation. With so much information right at your fingertips, almost anyone can learn how to do almost anything. A quick g00gle search will already tell you how to get 6-pack abs and build a flame thrower, i.e. you don't need to re-write that and create more GI. However, what we don't know, and possibly want to know, is what happens/happened when you got your abs and made a dangerous weapon. This is unique information that only you can produce. Thus, we should continue to create.

DIY ain’t dead. You should absolutely do things yourself for the learning experience. This also does not mean that you should not write How-Tos; there are tonnes of things that we don’t know how to do, or have learned how to do incorrectly.

But, don’t stop there (or do if it has been over done). Teach us about your experience. Tell us What Happens When (WHW).

What? So What? Now What?

WHW as performance art?

WHW as performance art?

Robots are better than I am at my job, or they soon will be. In education, many kids have lost motivation and can’t concentrate. The internet is a better teacher than I am. Children know what they are supposed to learn, and in general they understand why they are supposed to learn it (they just saw it all online yesterday). However, a frequently missed component in education is what we can or should actually do with that knowledge. i.e. What happens when I apply this information?

I was recently taught this process of inquiry:

What: What are you talking about?
So What: Why are you talking about it?
Now What: What do you do about it?

The last piece is really what still makes community and classrooms relevant, but sometimes we forget to teach that. Maybe it’s because of our consumption-only habits. Maybe it’s because someone wants to keep us under control. Maybe it’s because we keep stopping at “maybe,” and only choose to watch from behind the glass.

Technology is here.
It can help us.
Now what do we do with it?

Caveat Emptor Rex

I’m Steggers

“What part of recreating dinosaurs and putting them in a theme park was a good idea?”

Many story premises are ridiculous, but it is undeniable that they are also entertaining. If we wrote a story exclusively about “how to extract dinosaur DNA,” then we might be sorely disappointed; however, we can’t help but wonder WHW we bring dinosaurs back to life and put them in close proximity to people. (SPOILER ALERT: things get ate).

The Jurassic world in which we live has an appetite of curiosity. In some cases, our brains are still quite primitive. We do a lot of stupid things all the time that slip through the systems unobstructed. So, Mr. Crichton's premise is actually extremely insightful, and is an extreme example of the primal nature that continues to run the world today. Not to mention, it's extremely interesting. Extreme!

If you bought into the idea of adding more value to the things that we create, then we must also be aware of the underlying dangers that are already present.

WHW is actually responsible for some of the dino dung that we’re facing today. When we keep feeding our ancient reptilian brains with consumer urges, we just perpetuate the problem. WHW we make it bigger, faster, scarier? (SPOILER ALERT: things get ate bigger, faster, scarier). Sometimes the consumer doesn’t know what’s best for itself; what it wants (or has been trained to want), isn’t always what it needs.

We need to run the scenarios in our heads first. We need to focus on what benefits the whole rather than what gets me more tokens. We need to then make those things happen.

When we get involved in the outcome, then we can get out of the “safety” of our voyeuristic tendencies that lead to destructive demands and curmudgeonist complaints.

We need a deep sense of community.

OUR Situation

Hello there

The Obvious Ubuntu Relevance is right in front of our faces. That circle of friends is severely affected by the action or inaction of each member.

I have bought an Ubuntu device because it’s awesome, but what am I going to do with it?

I have joined an Ubuntu circle because I need community, but what am I going to contribute?

I know how to do these things. But, WHW I actually do something with this knowledge?

It’s cool to sit back and gain confidence—be rational—before producing something in the community, but it’s important that we don’t get stuck at the instructional stage. Furthermore, our contributions need not be that extreme!

Consistent application, experience, and collaboration should lead to progress.

A little less… A little more… (source)

on July 02, 2015 03:54 AM
It's never been easier to write tests for your application! I wanted to share some details on the new documentation and other tidbits that will help you ensure your application has a nice testsuite. If you've used the SDK in the past, you understand how nice it can make your development workflow. Writing code and running it on your desktop, device, or emulator is a snap.

Fortunately, having a nice testsuite for your application can be just as easy. First, you will notice that all of the wizards inside the SDK now come with nice testsuites already in place. They are ready for you to simply add more tests; the setup and heavy lifting is done. See for yourself!


Secondly, developer.ubuntu.com has a great section on every level of testing, no matter which language you use with the SDK. You'll find API references for the tools and technology used, along with helpful guides to get you in the proper mindset.

For autopilot itself, there's also API documentation for the various 'helpers' that will make writing tests much easier for you. In addition, there's a guide to running autopilot tests. This has been made even easier by the addition of Akiva's Autopilot plugin inside the SDK. I'll be sharing details on this as soon as it's packaged, but you can see a sneak peek in this video.

Finally, you will find a guide on how to structure your functional tests. These are the most demanding to write, and it's important to ensure you write your tests in a maintainable way. Don't forget about the guide on writing good functional tests either.
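To give a flavour of what a small functional test looks like, here is a minimal, hypothetical sketch using the stock autopilot API; the application binary name ('my-app'), the object name ('MainView') and the window title are made up for illustration:

from autopilot.testcase import AutopilotTestCase
from autopilot.matchers import Eventually
from testtools.matchers import Equals


class MainViewTestCase(AutopilotTestCase):
    """Exercise the main view of a hypothetical 'my-app' application."""

    def setUp(self):
        super(MainViewTestCase, self).setUp()
        # Launch the app and keep a proxy object for its introspection tree.
        self.app = self.launch_test_application('my-app')

    def test_window_title(self):
        # select_single returns the single matching object from the tree.
        main_view = self.app.select_single('MainView')
        # Eventually retries the assertion until it passes or times out.
        self.assertThat(main_view.title, Eventually(Equals('My App')))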

No matter what language or level you write tests for, the guides are there to help you. Why not try adding a test or two to your project? If you are new, check out one of the wizards and try adding a simple testcase. Then apply the same knowledge (and templated code!) to your own project. Happy test writing!
on July 02, 2015 03:16 AM

Snappy Open House!

Nicholas Skaggs

Introducing Snappy Open Houses! Snappy represents some new and exciting possibilities for Ubuntu. A Snappy Open House is your chance to get familiar with the technology while helping test and break things! We plan to do an open house before each release as a chance for everyone to interact, provide feedback and help with testing. As such, this is a great way to get started in the snappy world!

So what exactly do I mean by an open house? We want to encourage the community to test with us, explore new features, check for possible regressions and exercise the documentation.  An open house is a chance to come and meet the snappy team developers and help QA test the new image.

During the open house, we'll host a live broadcast on ubuntuonair.com. As part of the broadcast, we'll speak with some of the developers behind snappy and show off new features in the upcoming release. We'll also demonstrate how to flash and test the new release so you can follow along and help test. Finally we'll answer any questions you have and stick around on IRC for a bit to discuss any issues found during testing.

In other words, it's time set aside for you to come and try out snappy! I know what you are thinking: "I don't have a cool IoT device to run snappy with". You are in luck! You can run snappy on your desktop or laptop; you don't need a device, as you can install snappy on your local machine via kvm. If you do have a device, bring it and prepare to have some fun!

The first of these snappy open houses will be July 7th at 1400 UTC. Please stop by and help test with us, try out snappy, and meet the snappy team!

You can find out more information on the wiki. Mark your calendars and see you next Tuesday!
on July 02, 2015 02:54 AM

July 01, 2015

Convergence through Divergence

Sebastian Kügler

It’s that time of the year again, it seems: I’m working on KPluginMetaData improvements.

In this article, I am describing a new feature that allows developers to filter applications and plugins depending on the target device they are used on. The article targets developers and device integrators and is of a very technical nature.

Different apps per device

This time around, I’m adding a mechanism that allows us to list plugins, applications (and the general “service”) specific for a given form factor. In normal-people-language, that means that I want to make it possible to specify whether an application or plugin should be shown in the user interface of a given device. Let’s look at an example: KMail. KMail has two user interfaces, the desktop version, a traditional fat client offering all the features that an email client could possibly have, and a touch-friendly version that works well on devices such as smart phones and tablets. If both are installed, which should be shown in the user interface, for example the launcher? The answer is, unfortunately: we can’t really tell as there currently is no scheme to derive this information from in a reliable way. With the current functionality that is offered by KDE Frameworks and Plasma, we’d simply list both applications, they’re both installed and there is no metadata that could possibly tell us the difference.

Now the same problem applies not only to applications but also, for example, to settings modules. A settings module (in Frameworks terms a "KCM") can be useful on the desktop but ignored for a media center. There may also be modules which provide similar functionality, but for a different use case. We don't want to create a mess of overlapping modules, however, so again, we need some kind of filtering.

Metadata to the rescue

Enter KPluginMetaData. KPluginMetaData gives information about an application, a plugin or something like this. It lists name, icon, author, license and a whole bunch of other things, and it lies at the base of things such as the Kickoff application launcher, KWin’s desktop effects listing, and basically everything that’s extensible or uses plugins.

I have just merged a change to KPluginMetaData that allows all these things to specify which form factors they are relevant and useful for. This means that you can install, for example, KDevelop on a system that can be either a laptop or a mediacenter, and an application listing can be adapted to only show KDevelop when in desktop mode, skipping it in media center mode. This is of great value when you want to unclutter the UI by filtering out irrelevant "stuff". As this mechanism is implemented at the base level, KPluginMetaData, it's available everywhere, using the exact same mechanism. When listing or loading "something", you simply check if your current form factor is among the suggested useful ones for an app or plugin, and based on that you make a decision whether to list it or skip it.

With increasing convergence between user interfaces, this mechanism allows us to adapt the user interface and its functionality in a fully dynamic way, and reduces clutter.

Getting down and dirty

So, how does this look exactly? Let's take KMail as an example, and assume for the sake of this example that we have two executables, kmail and kmail-touch. Two desktop files are installed, which I'll list here in short form.

For the desktop fat client:

[Desktop]
Name=Email
Comment=Fat-client for your email
Exec=kmail
FormFactors=desktop

For the touch-friendly version:

[Desktop]
Name=Email
Comment=Touch-friendly email client
Exec=kmail-touch
FormFactors=handset,tablet

Note that the "FormFactors" key does not just take one fixed value, but allows specifying a list of values — an application may support more than one form factor. This is reflected throughout the API with the plural form being used. Now the only thing the application launcher has to do is to check if the current form factor is among the supplied ones, for example like this:

foreach (const KPluginMetaData &app, allApps) {
    if (app.formFactors().count() == 0 || app.formFactors().contains("desktop")) {
        shownAppsList.append(app);
    }
}

In this example, we check whether the plugin metadata specifies any form factors by counting the elements, and if it does, we check whether "desktop" is among them. For the above mentioned example files, this means that the fat client will be added to the list and the touch-friendly one won't. I'll leave it as an exercise to the reader how one could filter only applications that are specifically suitable for, say, a tablet device.

What devices are supported?

KPluginMetaData does not itself check if any of the values make sense. This is done by design because we want to allow for a wide range of form-factors, and we simply don’t know yet which devices this mechanism will be used on in the future. As such, the values are free-form and part of the contract between the “reader” (for example a launcher or a plugin listing) and the plugins themselves. There are a few commonly used values already (desktop, mediacenter, tablet, handset), but in principle, adding new form-factors (such as smartwatches, toasters, spaceships or frobulators) is possible, and part of its design.

For application developers

Application developers are encouraged to add this metadata to their .desktop files. Simply adding a line like the FormFactors one in the above examples will help to offer the application on different devices. If your application is desktop-only, this is not really urgent, as in the case of the desktop launchers (Kickoff, Kicker, KRunner and friends) we'll likely use a mechanism like the above: no form factors specified means list it. For devices where most applications will likely not work, marking your app with a specific form factor will increase the chances of it being found. As applications are adapted to respect the form factor metadata, its usefulness will increase. So if you know your app will work well with a remote control, add "mediacenter"; if you know it works well on touch devices with a reasonably sized display, add "tablet"; and so on.

Moreover…

We now have the basic API, but nobody uses it yet (a chicken-and-egg situation, really). I expect that one of the first users will be Plasma Mediacenter. Bhushan is currently working on the integration of Plasma widgets into its user interface, and he has already expressed interest in using this exact mechanism. As KDE software moves onto a wider range of devices, this functionality will be one of the cornerstones of the device-adaptable user interface. If we want to use device UIs to their full potential, we do not just need converging code, we also need to add divergence features to allow benefiting from the differences between devices.

on July 01, 2015 10:53 PM


Building on their UOS Hangout, the Kubuntu Podcast Team has created their second Hangout, featuring Ovidiu-Florin Bogdan, Aaron Honeycutt, and Rick Timmis, discussing What is Kubuntu?

on July 01, 2015 08:39 PM
A phone that I can define with a single word: IMPRESSIVE.

Meizu MX4 Ubuntu Edition

The most powerful phone running Ubuntu, the Meizu MX4, comes in a sober white box.

Box


The first time you pick it up, its design is striking: very elegant, with rounded lines and a weight of only 147 g for a size of 144 x 75.2 x 8.9 mm.

Design


What draws the most attention is its magnificent 5.36" screen, with a 1920 x 1152 resolution (418 PPI) and Gorilla Glass 3. The screen offers extraordinary quality: it looks great and, once you get used to this size, I assure you that you won't want a smaller phone.

Screen

It is the only Ubuntu phone with 4G. My previous phones only had 3G, and the 4G makes a big difference: any page or application that uses data loads instantly.

Connectivity

But what I personally appreciate most is its exceptional rear camera, not only for its 20.7 MP sensor, also protected by Gorilla Glass 3, but for the contrast, definition and colour achieved in its photographs.

Camera
Here is an example I shot this weekend:

An example of the photographic quality

The battery has plenty of capacity, 3100 mAh, although it still needs some tuning from Canonical.


More than better!


The rest of the hardware speaks for itself: a dual quad-core CPU (4 x ARM A17 at 2.2 GHz + 4 x ARM A7 at 1.7 GHz), 2 GB of RAM and 16 GB of internal storage (with no microSD expansion).

MX4

Its price: €299. Not expensive for a high-end device and, currently, Ubuntu's flagship.

You can buy the Meizu MX4 here.

Photographs by David Castañón, licensed under CC BY 3.0.
on July 01, 2015 03:15 PM

Add a C++ backend to your QML UI

Ubuntu App Developer Blog

Whether you are creating a new app or porting an existing one from another ecosystem, you may need more backend power than the QML + JavaScript duo proposed in the QML app tutorial.

Let's have a peek at how to add a C++ backend to your application, using system libraries or your own, and vastly increase its performance and potential features.

In this tutorial, you will learn how to use and invoke C++ classes from QML and integrate a 3rd party library into your project.

Read the tutorial

on July 01, 2015 02:13 PM

June 30, 2015

Using the Compiz grid plugin, Unity supports placing windows, one at a time, in a tiled-like fashion. However, there is no support for tiling a workspace in one fell swoop. That is something which users of dwm, wmii, i3, xmonad, awesome, qtile, etc. have come to expect.

A few years ago I ran across a python script called stiler which tiled all windows, mainly using wmctrl. I’ve made a few updates to make that work cleanly in Unity, and have been using that for about a week. Here is how it works:

windows-enter is mapped to "stiler term". This starts a new terminal (of the type defined in ~/.stilerrc), then tiles the current desktop. windows-j and windows-k are mapped to 'stiler simple-next' and 'stiler simple-prev', which first call the 'simple' function to make sure windows are tiled if they weren't already, and then focus the next or previous window. So, if you have a set of windows which isn't tiled (for instance you just exited a terminal), you can hit win-j to tile the remaining windows. Windows-shift-j cycles the tile locations so that the active window becomes the first non-tiled, etc.

This is clearly very focused on a dwm-like experience. stiler also supports vertical and horizontal layouts, and could easily be taught others like matrix.
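For anyone curious about the mechanics, here is a minimal sketch (not the actual stiler code) of how a script can tile the current desktop's windows with wmctrl; the 1920x1080 screen geometry is an assumption for illustration:

import subprocess

SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen geometry


def current_desktop():
    # "wmctrl -d" lists desktops; the current one is marked with '*'.
    for line in subprocess.check_output(['wmctrl', '-d']).decode().splitlines():
        fields = line.split()
        if fields[1] == '*':
            return fields[0]


def windows_on(desktop):
    # "wmctrl -l" lists windows as: <id> <desktop> <host> <title>.
    wins = []
    for line in subprocess.check_output(['wmctrl', '-l']).decode().splitlines():
        fields = line.split(None, 3)
        if fields[1] == desktop:
            wins.append(fields[0])
    return wins


def tile(windows):
    # Give each window an equal-width column spanning the full screen height.
    if not windows:
        return
    width = SCREEN_W // len(windows)
    for i, win in enumerate(windows):
        geometry = '0,%d,0,%d,%d' % (i * width, width, SCREEN_H)
        subprocess.call(['wmctrl', '-i', '-r', win, '-e', geometry])


if __name__ == '__main__':
    tile(windows_on(current_desktop()))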

If this is something that anyone but me actually wants to use, I'll package it properly in a PPA, but for now the script can be found at
http://people.canonical.com/~serge/stiler .


on June 30, 2015 08:30 PM

Meeting Minutes

IRC Log of the meeting.

Meeting minutes.

Agenda

20150630 Meeting Agenda


Release Metrics and Incoming Bugs

Release metrics and incoming bug data can be reviewed at the following link:
– http://kernel.ubuntu.com/reports/kt-meeting.txt


Status: CVE’s

The current CVE status can be reviewed at the following link:
– http://kernel.ubuntu.com/reports/kernel-cves.html


Status: Stable, Security, and Bugfix Kernel Updates – Precise/Trusty/Utopic/Vivid

Status for the main kernels, until today:

  • Precise – Verification & Testing
  • Trusty – Verification & Testing
  • Utopic – Verification & Testing
  • Vivid – Verification & Testing

    Current opened tracking bugs details:

  • http://kernel.ubuntu.com/sru/kernel-sru-workflow.html
    For SRUs, SRU report is a good source of information:
  • http://kernel.ubuntu.com/sru/sru-report.html

    Schedule:

    cycle: 13-Jun through 04-Jul
    ====================================================================
    12-Jun Last day for kernel commits for this cycle
    14-Jun – 20-Jun Kernel prep week.
    21-Jun – 04-Jul Bug verification; Regression testing; Release


Open Discussion or Questions? Raise your hand to be recognized

No open discussion.

on June 30, 2015 05:11 PM

Publishing lxd images

Serge Hallyn

While some work remains to be done for 'lxc publish', the current support is sufficient to show a full image workflow cycle with lxd.

Ubuntu wily comes with systemd by default. Sometimes you might need a wily container with upstart. And to repeatedly reproduce some tests on wily with upstart, you might want to create a container image.

# lxc remote add lxc images.linuxcontainers.org
# lxc launch lxc:ubuntu/wily/amd64 w1
# lxc exec w1 -- apt-get -y install upstart-bin upstart-sysv
# lxc stop w1
# lxc publish --public w1 --alias=wily-with-upstart
# lxc image copy wily-with-upstart remote:  # optional
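
As a quick sanity check (not part of the original walkthrough), you can confirm that the publish step worked and the alias exists before copying it anywhere:

# lxc image list

Running lxc image delete wily-with-upstart removes the image again once you no longer need it.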

Now you can start a new container using

# lxc launch wily-with-upstart w-test-1
# lxc exec w-test-1 -- ls -alh /sbin/init
lrwxrwxrwx 1 root root 7 May 18 10:20 /sbin/init -> upstart
# lxc exec w-test-1 run-my-tests

Importantly, because “--public” was passed to the lxc publish command, anyone who can reach your lxd server or the image server at “remote:” will also be able to use the image. Of course, for private images, don’t use “--public”.

Enjoy!


on June 30, 2015 03:20 AM

Super star Ubuntu Weekly Newsletter contributor Paul White recently was reflecting upon his work with the newsletter and noted that he was approaching 100 issues that he’s contributed to. Wow!

That caused me to look at how long I’ve been involved. Back in 2011 the newsletter went on a six-month hiatus when the former editor had to step down due to obligations elsewhere. After much pleading for the return of the newsletter, I spent a few weeks working with Nathan Handler to improve the scripts used in the release process and doing an analysis of the value of each section of the newsletter relative to how much work it took to produce each week. The result was a slightly leaner, but hopefully just as valuable, newsletter, which now took about 30 minutes for an experienced editor to release rather than 2+ hours. This change was transformational for the team, allowing me to be involved for a whopping 205 consecutive issues.

If you’re not familiar with the newsletter, every week we work to collect news from around our community and the Internet to bring together a snapshot of that week in Ubuntu. It helps people stay up to date with the latest in the world of Ubuntu and the Newsletter archive offers a fascinating glimpse back through history.

But we always need help putting the newsletter together. We especially need people who can take some time out of their weekend to help us write article summaries.

Summary writers

Summary writers receive an email every Friday evening (or early Saturday) US time with a link to the collaborative news links document for the past week, which lists all the articles that need 2-3 sentence summaries. These people are vitally important to the newsletter. The time commitment is limited and it is easy to get started from the very first weekend you volunteer. No need to be shy about your writing skills; we have style guidelines to help you on your way, and all summaries are reviewed before publishing, so it’s easy to improve as you go.

Interested? Email editor.ubuntu.news@ubuntu.com and we’ll get you added to the list of folks who are emailed each week.

I love working on the newsletter. As I’ve had to reduce my commitment to some volunteer projects I’m working on, I’ve held on to the newsletter because of how valuable and enjoyable I find it. We’re a friendly team and I hope you can join us!

Still just interested in reading? You have several options:

And everyone is welcome to drop by #ubuntu-news on Freenode to chat with us or share links to news we may find valuable for the newsletter.

on June 30, 2015 02:29 AM

Welcome to the Ubuntu Weekly Newsletter. This is issue #423 for the week June 22 – 28, 2015, and the full version is available here.

In this issue we cover:

This issue of the Ubuntu Weekly Newsletter is brought to you by:

  • Paul White
  • Elizabeth K. Joseph
  • And many others

If you have a story idea for the Weekly Newsletter, join the Ubuntu News Team mailing list and submit it. Ideas can also be added to the wiki!

Except where otherwise noted, content in this issue is licensed under a Creative Commons Attribution ShareAlike 3.0 License.

on June 30, 2015 01:49 AM

June 29, 2015

Just Say It!

Ted Gould

While I love typing on small on-screen keyboards on my phone, it is much easier to just talk. When we did the HUD we added speech recognition there, and it processed the audio on the device, giving the great experience of controlling your phone with your voice. That worked well with the limited command set exported by the application, but doing generic voice recognition today requires more processing power than a phone can reasonably provide. Which made me pretty excited to find out about HP's IDOL on Demand service.

I made a small application for Ubuntu Phone that records the audio you speak at it, and sends it up to the HP IDOL on Demand service. The HP service then does the speech recognition on it and returns the text back to us. Once I have the text (with help from Ken VanDine) I set it up to use Content Hub to export the text to any other application that can receive it. This way you can use speech recognition to write your Telegram notes, without Telegram having to know anything about speech at all.
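
Conceptually the round trip is just an HTTP upload of the recording plus reading the recognised text out of the JSON response. The curl sketch below is not taken from the app, and the endpoint path and parameter names are assumptions based on how IDOL on Demand's synchronous APIs were commonly addressed at the time, so treat it as illustrative only:

# hypothetical request: endpoint and parameters are assumptions, check the IDOL on Demand docs
curl -s -X POST \
  -F "file=@recording.wav" \
  -F "apikey=$IDOL_API_KEY" \
  "https://api.idolondemand.com/1/api/sync/recognizespeech/v1"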

The application is called Just Say It! and is in the Ubuntu App Store right now. It isn't beautiful, but definitely shows what can be done with this type of technology today. I hope to make it prettier and add additional features in the future. If you'd like to see how I did it you can look at the source.

As an aside: I can't get any of the non-English languages to work. This could be because I'm not a native speaker of those languages. If people could try them I'd love to know if they're useful.


on June 29, 2015 04:29 AM

June 28, 2015

If you read my blog you already know that since April I’ve been working as a developer at Archon. It’s something I enjoy a lot, and the last week has been awesome.

Archon joined the Hello Tomorrow Conference 2015, so last week I travelled to Paris.

And there I met some of the best people in the world, people who are changing the world, not only in computer science, but in every field.

Here are some of the things that most inspired me. I hope they can inspire you too, and give you the energy to be the change you want to see in the world.

Archon Team

First of all, thanks to the guys who were with me in Paris. Davide Venturelli, the CEO, works at NASA and is currently in charge of surveying the scientific investigations performed at the Quantum Artificial Intelligence Laboratory. What he does is incredible, and he motivated me a lot to follow my dreams now.

You can watch his pitch at the conference on YouTube (recorded with my Ubuntu Phone). Seriously, find 8 minutes today and watch it, so you can understand what Archon is about and why I like it.

Giovanni Landi is our 3D expert, and has a lot of different passions. I had a lot of fun working with him at our stand, and I learned a lot of things about art (one of his passions).

Roberto Navoni is our hardware expert, and his life should be an inspiration for every Italian. He’s an entrepreneur who created a company in Italy and, despite the difficulties, didn’t move abroad.

Davide Ghezzi is our CFO. Unfortunately he was able to join us only for the first day, but he got a lot done. I have no idea how a single man can have so much energy, but wow!

The stand

I spent most of the time at our stand, where I explained our product to both potential investors and casual visitors. As you can read, my English isn’t great, so I was quite surprised everyone understood what I was saying.

Anyway, meeting so many people from all around the world was amazing, everyone with incredible experiences and cool backgrounds. I spent a lot of time talking about the future, and how to do things that could impact the world. I listened to a lot of stories, and I remember each of them, because every one was incredible.

The keynotes

During the event I was able to take a look at a couple of keynotes (I spent the rest of the time at the stand), and both were something you don’t see every day.

The first one was by Obi Felten, who works on moonshots at Google[X]. I don’t agree with a lot of Google’s policies, but the energy these guys put into trying to build something beautiful, and how hard they work with open minds, is something that deserves deep respect and admiration.

The second one was by the CEO of G-Therapeutics. They have developed a working (though still in development) technology that helps paralyzed people walk again. Let me repeat: a stimulation system to rehabilitate individuals with spinal cord injury.

The presentation was the most moving thing I’ve ever seen, and it earned minutes of applause.

The companies

Other than Archon, there were a lot of other interesting companies, both for what they do and for the stories of their founders.

Here is a short list of the ones I liked most; it is far from complete (you can read the entire list on the Hello Tomorrow website).

  • Blitab is a braille tablet helping blind people. I love how technology nowadays can help less fortunate people live a better life.
  • BioCarbon Engineering is changing the world 1 billion trees at a time. They use drones to do precision planting and optimize reforestation. You know, trees don’t give free wifi, but they do give oxygen, so they are useful. Indeed, BioCarbon won the competition.
  • Artomatix builds software to automate the generation of art, enabling digital graphic artists to focus on being creative while reducing project times and costs. Ok, it’s not a world changer, but the gamer in me loves the software, so I really hope they succeed.
  • Solenica is building Lucy. Lucy’s mirror follows the sun and reflects sunlight into your rooms, creating a beautiful natural glow. Other than the product (it lets you reduce your carbon footprint by up to 1 ton/year by saving electricity, and I like things that help the environment), I like the story of the startup, founded by three Italians. It’s sad they had to go to the U.S. to follow their dream, but I love their stubbornness in going forward. Only people like them go forward and make the world a better place.

Conclusion

Besides the inspiration, that week also gave me confirmation that I’m on the right path to doing something with my life that helps make the world a better place. A lot of people encouraged me to continue on this path, and, you know, public recognition of your work is important.

Ciao,
R.

on June 28, 2015 10:56 PM

Availability

Stuart Langridge

Some very interesting discussions happened at Edgeconf 5, including a detailed breakout session on making your web apps work for everyone which was well run by Lyza Danger Gardner. We talked about performance, and how if your page contains HTML then your users see your interface sooner. About fallbacks, and how if you’re on a train with a dodgy 3g connection the site should work and that’s a competitive advantage for you, because your competitors’ sites probably don’t. About isomorphic JavaScript and how the promise of it is that your Angular website won’t have to wait until it’s all downloaded before showing anything. About Opera Mini’s 250 million users. It’s about whether the stuff you build is available to the most people. About your reach, and you being able to reach more than the others.

In the past, we’ve called this “progressive enhancement”, but people don’t like that word. Because it sounds hard. It sounds like you’re not allowed to use modern tools in case one user has IE4. Like you have to choose between slick design and theoretical users in Burma.

Much rhetorical use has been made of the gov.UK team’s results of people not getting the script on their pages. The important part of that result was that 0.9% of visits didn’t run the client side scripting even though they should have done. It’s not people with JavaScript turned off, it’s people with browsers that for some reason didn’t run it at all. Did you open a hundred web pages yesterday? I probably did. So for every hundred web pages opened by someone, one of them didn’t work. Maybe they were in a tunnel and the 3g cut out. Maybe they were on hotel WiFi. Maybe the CDN went down for ten seconds. Maybe the assets server crashed. But, for whatever reason, some of your site didn’t work. Did that make your site unavailable to them? Not if it was written right, written to be available.

And “written right” does not mean that you have double the work to build a version of your WebGL photo editor that works in Lynx. If you do this by having isomorphic JS, so your node server provides HTML which makes your pages load before your 2MB of bower JS arrives, that’s fine. Because you’re available to everybody; a Macbook user in a cafe, a finance director on her Windows desktop, a phone-using tween in a field with no coverage, and yes even Opera Mini users in Burma.

It’s not about giving up your frameworks to cater for fictional example users with scripting disabled. It is true that not everyone has JS and that sometimes that’s you, so let’s work out how to do this without regressing to 1998.

So I’m not going to be talking about progressive enhancement any more. I’m going to be talking about availability. About reach. About my web apps being for everyone even when the universe tries to get in the way.

(Also, more on why availability matters, with smiling diagrams!)

on June 28, 2015 02:20 PM

When I first left desktops behind for a laptop (Lenovo T500) it was a tough step. I was used to building my own desktops from the components I selected. I was used to the power of a desktop. Converting to using a laptop was an exercise in compromises. The transition from a 15″ laptop to a smaller lighter laptop is similar, but this is the first time I have taken a step back in the area of memory. I am converting from a Lenovo T530 to a Dell XPS 13 (9343) Developer Edition. This article will cover accessories I own or am considering purchasing to replace some of the lost features of the larger laptop.

Video Out
If you use your laptop to present, or would like to have a larger monitor at your desk, then you will want an adapter from mini-DisplayPort to some other input (VGA, DVI or DisplayPort). In my case I went with the MDP-HDMI from Plugable, which converts from mini-DisplayPort to HDMI. Most of the presentations I do are displayed on a large-screen television with HDMI inputs, which makes this solution ideal. Linux does not have support for USB 3.0 DisplayLink devices, but you could also choose to use a USB 2.0 docking station. As long as you do not have USB 3.0 drives or a need for gigabit Ethernet connections, that would be a possible solution.
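
Once the adapter is plugged in, a quick way to check that the external screen is detected and to extend the desktop (the output names below are examples; yours may differ) is:

xrandr --query                               # lists outputs such as eDP1, DP1, HDMI1 and whether a monitor is connected
xrandr --output DP1 --auto --right-of eDP1   # extend the desktop onto the external screen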

Network
Wireless works great when you are mobile, and fairly well even when you are not. For most people there is no need for a wired connection, but if you move large files then having a gigabit connection is a must-have. For this I use a Plugable USB3-E1000 device. Moving large files at 118 MB/s is much more enjoyable than at 35 MB/s.
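
To confirm that a USB adapter has actually negotiated a gigabit link (generic commands, not specific to the USB3-E1000):

lsusb                            # check that the adapter shows up on the USB bus
ip link                          # find the network interface name the driver created
sudo ethtool eth0 | grep Speed   # replace eth0 with that interface; it should report 1000Mb/s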

USB 3.0 Hub
With only two USB ports, a hub can make it easier to attach multiple devices. In my case, since I decided to use the USB3-E1000 device, I would only have one available USB port. I have the Plugable USB3-HUB7A, which has seven ports. This USB hub has not been stable for me with either the Lenovo T530 or the Dell XPS 13 (9343); I am not sure if there is a firmware issue or something else. The current issue is that devices plugged into the hub are not always recognized. That said, this hub still allows me to use three devices directly attached and another two through a second USB 2.0 hub.

 


on June 28, 2015 02:07 AM

June 27, 2015

Since 2014 I have been running static code analysis using tools such as cppcheck and smatch against the Linux kernel source on a regular basis to catch bugs that creep into the kernel. After each cppcheck run I diff the logs to get a list of deltas in the error and warning messages, and I periodically review these to filter out false positives, ending up with a list of bugs that need some attention.
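
A rough sketch of that run-and-diff loop (the paths, options and file names here are illustrative, not the exact invocation used) looks something like this:

# run cppcheck across the kernel tree; findings are written to stderr
cppcheck --enable=warning,portability --force -j8 linux/ 2> cppcheck-$(date +%Y%m%d).log

# compare with the previous run to see what has crept in since
diff cppcheck-old.log cppcheck-new.log | grep '^>'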

Bugs such as allocations returning NULL pointers without checks, memory leaks, duplicate memory frees and uninitialized variables are easy to find with static analyzers and generally just require one- or two-line fixes.

So what are the overall trends like?

Warnings and error messages from cppcheck have been dropping over time, while "portable warnings" have been slowly increasing. "Portable warnings" come mainly from arithmetic on void * pointers (which GCC treats as byte-sized, but which is not legal C). Note that there is some variation in the results, as I use the latest version of cppcheck; occasionally it finds a lot of false positives, which then get fixed in later versions of cppcheck.

Compared to the growth in kernel size, the overall downward trend in cppcheck warnings and errors isn't bad at all, considering the kernel has grown by nearly 11% over the time I have been running the static analysis.

[Chart: kernel source growth over time]

Since each warning or error reported has to be carefully scrutinized to determine whether it is a false positive (and this takes a lot of effort and time), I've not yet been able to determine the exact false positive rates for these stats. Compared to the actual lines of code, cppcheck is finding roughly one error per 15K lines of source.

It would be interesting to run this analysis with commercial static analyzers such as Coverity and see how the stats compare. As it stands, cppcheck is doing its bit in detecting errors and helping engineers to improve code quality.
on June 27, 2015 10:13 AM

The idea behind this video is to show Ubuntu equivalents for my most used Android apps.

Ubuntu apps shown:
Google+
YouTube
Gmail/Photos/Calendar/Drive
HERE / OSMtouch
Camera
Udropcabin
File Manager
CuteSpotify

Honourable Mentions:
OSMtouch
Telegram

EDIT: The small font bug in OSMscout (on the MX4) is now fixed. Yay!

on June 27, 2015 09:17 AM

June 26, 2015

 

[FCM issue 98 cover]

Full Circle, the independent magazine for the Ubuntu Linux community, is proud to announce the release of our ninety-eighth issue.

This month:
* Command & Conquer
* How-To : Conky Reminder, LibreOffice, and Programming JavaScript
* Graphics : Inkscape.
* Chrome Cult
* Linux Labs: Midnight Commander
* Ubuntu Phones
* Review: Saitek Pro Flight System
* Book Reviews: Automate the Boring Stuff with Python, and Teach Your Kids to Code
* Ubuntu Games: Minetest, and Free to Play Games
plus: News, Arduino, Q&A, and soooo much more.

Get it while it’s hot!
http://fullcirclemagazine.org/issue-98
on June 26, 2015 02:53 PM

S08E16 – The Hottie & the Nottie - Ubuntu Podcast

Ubuntu Podcast from the UK LoCo

It’s Episode Sixteen of Season Eight of the Ubuntu Podcast! Alan Pope, Mark Johnson, Laura Cowen, and Martin Wimpress are all together again and speaking to your brain.

In this week’s show:

That’s all for this week, please send your comments and suggestions to: show@ubuntupodcast.org
Join us on IRC in #ubuntu-podcast on Freenode
Follow us on Twitter
Find our Facebook Fan Page
Follow us on Google+

on June 26, 2015 12:34 PM

June 25, 2015

The first Alpha of Wily (to become 15.10) has now been released!

The Alpha-1 images can be downloaded from: http://cdimage.ubuntu.com/kubuntu/releases/wily/alpha-1/

More information on Kubuntu Alpha-1 can be found here: https://wiki.kubuntu.org/WilyWerewolf/Alpha1/Kubuntu
on June 25, 2015 07:54 PM
Again, some changes in the GTK libraries made our theme look wrong. This fix applies to GTK3 apps whose list boxes were greyed out, making them totally unreadable (bug #1464349). See the differences before and after:




Also, some fixes in the core Ubuntu theme corrected the titlebar for Unity and the toolbar "continuity effect" in all environments. Before and after:



As always, you can upgrade or get your theme from the Artwork page. If you're a Wily Werewolf user or you added the PPA to your system, these changes will arrive soon.
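
If you go the PPA route, the usual Ubuntu steps apply (the PPA name below is a placeholder; use the one listed on the Artwork page):

sudo add-apt-repository ppa:OWNER/THEME-PPA   # placeholder name, substitute the real PPA
sudo apt-get update
sudo apt-get upgrade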
on June 25, 2015 04:34 PM

It is a well known fact that social media platforms like Twitter help in marketing a business or product. As a blogger, your success depends on the number of visitors to your site. But to make that happen it is not enough to have a Twitter account and post the occasional tweet. You have to do more, and here are some tips that can help you along.

Get influential followers

Just like in the real world, influence counts for a lot in the virtual world. For instance, if your niche is tech and security, then find the Twitter users in this niche. You can also use tools to gauge the quality of their tweets and the response they get. Another way to find users in your niche is to look for hashtags with relevant names.

What you should be looking for are people who are really active on Twitter, have a lot of followers and have a habit of retweeting content. You can become their follower and in time they will return the favor.

Be active

Do not wait for followers to come to you. This is not the forum to be passive. Find followers and retweet their content. You should make your presence felt. The only way to be heard is first to listen and then engage. Sticking with tech and everything Internet: imagine someone trying to educate people on things such as Internet security; who would take advice from a random account they have never heard of?

Build your following and then prove your worth.

Information that you share should be unique but also useful

It should never appear that you are tweeting for the sake of tweeting. People are not fools; they will just ignore that and go to people who are sharing relevant and unique content. The content that you post needs to be your own. You can and should retweet any content that is relevant to your niche. Sharing tweets is also a way of showing people that you are personally invested in helping them.

You can use tools that are available online which will set up auto tweets throughout the day. This way the tweets are spaced out properly without you having to remember that a tweet is due.

Never send spam to followers

This is the worst thing that you could do to yourself. So never send auto-direct messages, which are considered spam by Twitter users. If you do, you risk making your brand suffer. But you can use direct messaging if you want to engage in a conversation.

Be consistent with your profile

If you have a presence on other social media too, then ensure that your profile is consistent across all of them. Don’t have one thing on Twitter and another on Facebook and so on. Ensure that the pictures are relevant to your brand.

Analyze your replies

Any tweet from you will result in replies. See how many you are getting and which tweets get more replies. This way you can work out which keywords are more popular and send more tweets with those words. In the same way, analyze the replies that you send to others’ tweets.

Twitter is a useful tool if you know how to use it, so spend time in learning about it and you will reap the benefits.

The post Top Twitter Marketing Tips for bloggers appeared first on deshack.

on June 25, 2015 04:16 PM