We are hiring: IoT app developers


App developer:

We are looking for an app developer to join us to work on exciting, challenging software projects. Ideally the applicant will have a degree in computing science and be interested in technology.

EyeSpyFX provide IoT app development services and products to security camera and access control manufacturers. Our clients include many of the world's leading security camera and access control manufacturers. We also have our own in-house range of software and hardware products to develop and maintain.

We take on difficult projects so it is essential that you are highly motivated and interested in technology generally. You will need to continue to learn new development techniques to allow you to grow and change with the range of projects we deal with.

We would welcome CVs from students who will graduate in June 2017 and from people with one or two years of experience.

Android wish list for 2017

Android 1.0, HTC Dream – 2008

When the first Android phones were launched it was unclear (to me at least) how the ideas of “search” and “mobile phone” would come together. (Crazy, I know!)

Fast forward to 2017: voice command and search integration with a security camera app might soon allow a user to say commands such as:
“Go to camera 34,
go back an hour,
go forward 5 minutes,
go back 1 minute,
zoom in,
pan left,
jump to live,
switch to Front Gate camera”.

The voice commands would control an app which would Chromecast the video to a big screen.
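To make the idea concrete, here is a minimal Kotlin sketch of how spoken phrases like those above might be mapped onto playback actions. The CameraController interface and the exact command phrasing are assumptions for illustration, not code from any shipping app.

    // Illustrative only: map recognised speech to camera/playback actions.
    // CameraController and its methods are hypothetical, not a real API.
    interface CameraController {
        fun switchTo(camera: String)
        fun seekBy(seconds: Long)          // negative = back, positive = forward
        fun jumpToLive()
        fun zoomIn()
        fun panLeft()
    }

    fun handleVoiceCommand(text: String, cam: CameraController) {
        val t = text.lowercase().trim()
        when {
            t.startsWith("go to camera") -> cam.switchTo(t.removePrefix("go to camera").trim())
            t.startsWith("switch to")    -> cam.switchTo(t.removePrefix("switch to").removeSuffix("camera").trim())
            t.startsWith("go back")      -> cam.seekBy(-parseDuration(t))
            t.startsWith("go forward")   -> cam.seekBy(parseDuration(t))
            t == "jump to live"          -> cam.jumpToLive()
            t == "zoom in"               -> cam.zoomIn()
            t == "pan left"              -> cam.panLeft()
        }
    }

    // Very rough duration parser: "an hour", "5 minutes", "1 minute" -> seconds.
    fun parseDuration(t: String): Long {
        val n = Regex("""\d+""").find(t)?.value?.toLong() ?: 1L
        return when {
            "hour" in t   -> n * 3600
            "minute" in t -> n * 60
            else          -> n
        }
    }

In a real app the hard part is the speech recognition and the fuzzy matching of vague phrases, which is exactly where platform-level search support would help.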

This vision is not exceptionally fanciful, as many security camera apps can do all of the above today – just using a visual touch UI instead of voice.

Voice commands and search are closely connected. A voice command is inherently vague. Search is a key computational mechanism used to interpret a voice command and find a best-fit reply.

There are just two barriers holding back the vision as outlined above: 1) in-app search and 2) custom voice commands.
1) In-app search is available only in a very limited sense at present. You can have Google index the app manifest, and app functions then show up when you do a relevant search (a rough sketch of this route follows below). This, however, does nothing to help search the user-generated content within an app.
Google have tried search of data held on private computers before. In 2004 Google launched a PC application called Desktop. Google Desktop indexed all data on your PC. The project was closed in 2011 because Google “switched focus to cloud based content storage”.
2) Requests for custom voice actions from third party app developers are currently closed. (This is also the case for Siri, by the way.)
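As a rough illustration of the app-manifest indexing route in point 1, the Kotlin sketch below shows an Android activity handling a deep link that a search result might open. The URI layout and the activity name are made up for the example.

    // Illustrative only: an Activity launched by a deep link such as
    // https://example.com/cameras/34, the kind of entry point app indexing can surface.
    // Requires a matching <intent-filter> (ACTION_VIEW, BROWSABLE) in the manifest.
    import android.app.Activity
    import android.os.Bundle

    class CameraDeepLinkActivity : Activity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            val cameraId = intent?.data?.lastPathSegment   // e.g. "34"
            if (cameraId != null) {
                // Route to the in-app camera screen for that id (app-specific code).
            }
            // Note: this exposes app *functions* to search; the user-generated
            // recordings and events inside the app remain invisible to Google.
        }
    }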

Custom voice commands – not yet (Dec 2016)

With neither in-app search nor custom voice actions available, it seems the vision of fully integrated voice control of apps is not viable – for now.

If OK Google and Siri continue to grow in popularity, will the pressure for custom voice commands also be the catalyst for enabling in-app search?

Voice actions and in-app search could be (more easily?) achieved if you move the location of apps from the phone to a Google/Apple account in the cloud. An added advantage of apps in the cloud is that we could log on from anywhere and use custom apps.

Choose Google or Apple

With thanks to Uber, maps, cheating in pub quizzes and countless other uses, it is now clear that search and phones are a perfect match. It seems (to me at least) that the next wave of development for search and phones will involve voice commands. Voice command based interfaces also seem to fit well with wearables and control of IoT devices.

To conclude, a seasonal wish list for 2017:

  • In app search for user generated data
  • Custom voice commands made accessible to third party app developers
  • Move the concept of apps away from the phone and onto a Google account. No more downloading.

Introducing Tiltmatic

Most security camera video streams are landscape shaped, and phones are mostly held in portrait. This little mismatch tends to result in security camera mobile apps appearing with shuttering above and below a central video image. Of course you can orientate the phone into landscape for a better placed image. However, the landscape orientation manoeuvre is something we naturally resist, and it is not easy if you are on the move.


Most Security Camera apps: Shuttering top and bottom with the image in the middle

That is why we created “Tiltmatic”. Tapping the “Tiltmatic” icon maximises the camera stream to the full height of the phone. Going full height means the full width of the camera stream can no longer be displayed. Tiltmatic solves this by bringing the rest of the image into view when you tilt the phone left and right: the left and right parts of the image roll into view as you tilt. Tilt just a little and the image moves over slowly; tilt quickly and the image sweeps to the far left or right position.

Tiltmatic gives you instant large screen viewing of the central part of the video stream while allowing the whole image to be viewed in a simple tilt interaction. It is a more sympathetic phone shaped solution to a classic design problem.
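For readers curious how a tilt-to-pan interaction like this can be built, here is a minimal Android/Kotlin sketch that maps sideways tilt from the accelerometer onto a horizontal pan fraction for a full-height video view. It is an illustration under our own naming, not the Tiltmatic source.

    // Minimal sketch of a tilt-to-pan controller (illustrative, not the Tiltmatic source).
    // Sideways tilt in portrait shows up on the accelerometer x axis; we turn it into
    // a pan fraction for a video view scaled to full screen height.
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager

    class TiltPanController(
        private val sensorManager: SensorManager,
        private val onPan: (fraction: Float) -> Unit   // 0f = far left, 1f = far right
    ) : SensorEventListener {

        private var fraction = 0.5f                    // start centred

        fun start() {
            val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
            sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME)
        }

        fun stop() = sensorManager.unregisterListener(this)

        override fun onSensorChanged(event: SensorEvent) {
            // In portrait, values[0] is the sideways component of gravity (about +-9.8 at 90 degrees).
            val roll = event.values[0] / SensorManager.GRAVITY_EARTH   // roughly -1..1
            // A small tilt nudges the window slowly; a bigger tilt moves it faster.
            fraction = (fraction - roll * 0.05f).coerceIn(0f, 1f)
            onPan(fraction)
        }

        override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
    }

The 0.05f gain, the sensor rate and any smoothing would all need tuning on real devices.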

Tiltmatic: Full height security video streams in portrait format – tilt to view left and right.

You can try out “Tiltmatic” in our Viewer for Axis Cams app.

Form Follows Phone

The mobile phone can be characterised as the product that has eaten everything. When the phone eats things, the things do not die; they just change shape. They become phone shaped.
My photo album, TV, email client, compass, map holder, calendar, address book, web browser and alarm clock are now all phone shaped portrait orientated rectangles.

My record collection has become a list on Spotify. I now share playlists! My diary with its flip over paper pages is now a scrolling list whose days appear as I swipe.

Will phone shapes continue to dominate?

There is an interesting trend where phone functions are distributed to a smart watch. This prompts a pause to think of a world that is post phone shaped – a wearable world.
There is also the rise of Siri and OK Google to consider. Maybe the phone shape will not be so important if we talk to our computers or, more fancifully, if our phones guess our needs and talk to us. These ideas are for the future, yes, perhaps the near future, but for now it is hard to see beyond a dominant phone shape – everywhere.

It is certain that much of the physical environment will morph into a mobile phone app. Early candidates for becoming phone shaped are home heating systems, access control, personal fitness and wellness. A clear trajectory is in place where the phone consumes many of our well-known physical objects and activities. In this there is a sense of loss that so many things and activities of days past are now phone shaped apps. Time, perhaps, for a short lament – and then we need to face the challenge.
If things become phone shaped then they should do so wholeheartedly – without pretending to be something else or harking back. Of course we should recognize the limitations of phone shapes and also recognize the opportunities for wonderful new assisted social activities.

There will be a period of growing up in this augmented environment we now share with personal machines. In the early days (we are in that period) there will be some poorly judged interactions. For example, just because you can share your toothbrush status does not mean that it is a good idea.

5 types of things?

Steve Sufaro of Axis Communications proposed a three-step test to define an IoT device. It is this:

  1. Is the device capable of being remotely detected; is there the ability to know what IoT devices and components are connected to a given network or system?
  2. Can the device become trusted and authenticated on a network?
  3. Is the device able to be updated and upgraded to enhance features, deliver data and improve device security?
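One way to read the test is as three yes/no questions about a device. The Kotlin sketch below is simply that reading written down; the type and field names are ours, not anything from Axis.

    // Illustrative reading of the three-step test as boolean checks.
    data class Device(
        val remotelyDetectable: Boolean,   // 1. can it be discovered on a network or system?
        val canBeAuthenticated: Boolean,   // 2. can it become trusted on the network?
        val upgradeable: Boolean           // 3. can features, data delivery and security be updated?
    )

    fun isIoTDevice(d: Device): Boolean =
        d.remotelyDetectable && d.canBeAuthenticated && d.upgradeable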

Assuming that the test is good and that a particular device passes on all counts, what further definition could be given to IoT devices? How should we think about things? What kind of language should we use?
It is clear that all IoT devices are communicating entities; however, not all IoT devices are equal. Different communication regimes pertain to different devices. Stratifying IoT devices by the sort of communication regime they operate in could help us think about things and the relationship we have with them. Here is our concept:

A plain old thing – fails all three of Sufaro’s tests and is not considered to be part of the IoT. A roll of sellotape or a pot, for example.

A local thing – the communications with this thing occur locally, and only locally. The device exists behind a LAN or some other sort of network (Bluetooth or a Zigbee mesh, for example). Detection and authentication are completed within the local network. Updating can be achieved by using a proxy device such as a mobile app, which in turn connects to the device and updates it.
An example of a local thing is a Bluetooth controlled heater working in connection with a mobile app.

A wide thing – communicates with someone or something on the internet. The architecture for such a device often includes: the device itself and peripheral sensors, a cloud service including data sources, a user control and monitoring panel, often in the form of a mobile app. Home automation control hubs and cloud based security camera systems are wide things.
A simple example of a wide thing is a lamp that changes colour when a favourite football team scores a goal.

A swarm thing – communicates with other things in the network. A swarm thing acts like a single entity although it is comprised of many constituent entities. Some access control systems are swarm things. A city traffic light system acting in unison could be a swarm thing. A change in the state of one thing affects the state of other things in the network. A key feature of a swarm thing is that it enables the addition of a connection to another thing thereby increasing its functionality.

An autonomous thing – acts in a wide environment, responding to cases it detects in order to achieve a set goal. The control panel for an autonomous thing allows the user to change parameters of the goal. The communication with an autonomous thing is primarily at the start of its life, when it is given its task. Devices for seeking an equilibrium state in an environment could be autonomous things.
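To picture the stratification, it can be written as a small type hierarchy keyed on communication regime rather than on intelligence. The Kotlin sketch below is purely illustrative and simply mirrors the categories above.

    // Illustrative sketch of the strata, keyed on communication regime.
    sealed class ThingStratum {
        object PlainOldThing : ThingStratum()                              // fails the three-step test
        data class LocalThing(val localNetwork: String) : ThingStratum()   // e.g. "Bluetooth", "Zigbee mesh"
        data class WideThing(val cloudService: String) : ThingStratum()    // talks to the internet
        data class SwarmThing(val peers: List<String>) : ThingStratum()    // talks to other things
        data class AutonomousThing(val goal: String) : ThingStratum()      // configured once, then self-directed
    }

    fun describe(t: ThingStratum): String = when (t) {
        is ThingStratum.PlainOldThing   -> "not part of the IoT"
        is ThingStratum.LocalThing      -> "reachable only over ${t.localNetwork}"
        is ThingStratum.WideThing       -> "reports to ${t.cloudService}"
        is ThingStratum.SwarmThing      -> "coordinates with ${t.peers.size} peer things"
        is ThingStratum.AutonomousThing -> "pursues goal: ${t.goal}"
    }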

The divisions between the strata are not fixed or ranked. It is possible to envisage an autonomous local thing, for example. The level of remote control, programmatic agency or artificial intelligence in the thing is not the critical stratification (things will certainly get smarter); it is instead the communications regime that any individual thing operates within that is the identifier.


EyeSpyFX introduce a new library for reading H.264 video.

For network camera and VMS manufacturers who need to build a mobile solution, SFX100 is a library of code that enables iOS and Android apps to be built that decode and display MJPEG and H.264 video using RTSP over TCP, RTSP over HTTP and RTSP over HTTPS.

Unlike bulky open source projects such as FFmpeg, Live555 and VLC, which are published under the GPL or LGPL, SFX100 is a proprietary library, available under licence, that is ready for immediate and efficient deployment in commercial mobile projects.

SFX100 is optimised for security camera video applications, uniquely offering a secure layer for streaming RTSP tunneled over HTTPS.
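For readers unfamiliar with the technique, the sketch below outlines the classic RTSP-over-HTTP(S) tunnelling scheme: two connections share a session cookie, a GET carrying the stream down and a POST carrying base64-encoded RTSP commands up. This is a generic Kotlin illustration of the approach, not SFX100 source code or its API.

    // Conceptual sketch of RTSP-over-HTTP(S) tunnelling, not SFX100 code.
    // The GET connection carries RTSP/RTP data down; the POST connection carries
    // base64-encoded RTSP commands up; both share an x-sessioncookie value.
    import java.util.Base64
    import java.util.UUID

    fun tunnelRequests(host: String, path: String): Pair<String, String> {
        val cookie = UUID.randomUUID().toString()

        val getRequest = buildString {
            append("GET $path HTTP/1.1\r\n")
            append("Host: $host\r\n")
            append("x-sessioncookie: $cookie\r\n")
            append("Accept: application/x-rtsp-tunnelled\r\n\r\n")
        }

        val postRequest = buildString {
            append("POST $path HTTP/1.1\r\n")
            append("Host: $host\r\n")
            append("x-sessioncookie: $cookie\r\n")
            append("Content-Type: application/x-rtsp-tunnelled\r\n")
            append("Content-Length: 32767\r\n\r\n")    // oversized on purpose; commands are streamed
        }
        return getRequest to postRequest
    }

    // RTSP commands written to the POST connection are base64 encoded:
    fun encodeRtspCommand(command: String): String =
        Base64.getEncoder().encodeToString(command.toByteArray())

Running the same exchange inside a TLS connection is what turns RTSP over HTTP into RTSP over HTTPS.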

SFX100 is exemplified in EyeSpyFX’s premier iOS mobile app “Doorcam” (https://itunes.apple.com/gb/app/doorcam/id1060661561?mt=8).

Key features include:

  • Secure layer for streaming RTSP tunneled over HTTPS.
  • Per project commercial licence
  • Optimised code for security camera video types
  • iOS and Android libraries available
  • Reads RTSP streams and provides a mechanism to pass them to phone-based native decoders
  • Compatible with IPv6

Contact us on info@eyespyfx.com for further information about how SFX100 can be deployed in mobile apps.

App Prototyping Tools

In EyeSpyFX we develop apps for complex security cam and access control systems for international clients. We have often been asked what App Prototyping Tools we use.

Well – the truth is we don’t use any specialized “app prototype tools”. That doesn’t mean that we don’t iterate and prototype – we certainly do, it just means we don’t use those app prototype tools. Nor does it mean that we are against the use of App Prototype Tools – we are keeping an open mind, but our current process does not involve those sort of tools.

We start out with a briefing document and then we draw stuff out – by hand (no prototype tools). The drawings are loose and very often we do a lot of them – often 100 to 200 sketches.

Hand Drawn (low fidelity) System Diagram

At some point we scan the drawings and start to create higher fidelity images, often systems diagrams and then screen shots. The system diagrams are agreed with clients and based on these diagrams we build the technical backbone for the project.

The screen shots become storyboards and interaction walk-throughs for the different personas who will use the app. These walk-throughs are checked against the system diagrams, and gradually the screenshots mature and become graphic assets for the app development project. To do all this we use Illustrator, InDesign, PowerPoint, Photoshop, etc. Each of these is a powerful software package, but general in nature and not a specific App Prototype Tool.

We feel that this fluid process might become a bit constrained and over formalized if we used an App Prototype Tool – so we don’t and so far so good.

Access All: Door controller App for AXIS A1001

We are proud to launch the Access All app for iOS. Access All allows you to remotely open doors using the A1001 Door Controller Unit from AXIS. You can also use the app to view access reports.

As an introductory offer Access All is free to download and use.

Access All door controller app for AXIS A1001

App features:
> The Door List
– View all your doors and their current status all on one list.
– Individual colour coded status for Locked, Unlocked, Access and Alarms.
– Includes an Instant Access button to unlock the door for a number of seconds, granting people entry (a rough sketch of this behaviour appears after the feature list).

> Door View
– Lock, Unlock and Access the door.
– See the current status of that door.
– View the door’s Event and Alarm logs.

> Event Log
View the Event Log for each door.
See the events from the door organised by date and filtered for your convenience.

> Alarm Log
View the Alarm Log for each door.
See only the alarms that you have pre-set alerts for.

> Adding your A1001 to the App
The App works with any A1001 device, whether it’s in Standalone Mode or part of a System.
Either auto-detect the A1001 devices on your network or manually enter their location.
Then enter your username and password to allow the App to access the A1001 unit.
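As promised above, here is a small Kotlin sketch of the Instant Access behaviour: unlock, hold for a few seconds, then re-lock. DoorController is a made-up abstraction for illustration; it is not the AXIS A1001 API or the Access All source.

    // Illustration of "Instant Access": unlock a door, wait, then lock it again.
    // DoorController is a hypothetical abstraction, not the AXIS API.
    import kotlinx.coroutines.NonCancellable
    import kotlinx.coroutines.delay
    import kotlinx.coroutines.withContext

    interface DoorController {
        suspend fun unlock(doorId: String)
        suspend fun lock(doorId: String)
    }

    suspend fun instantAccess(door: DoorController, doorId: String, seconds: Int = 7) {
        door.unlock(doorId)                  // the default of 7 seconds is arbitrary for the sketch
        try {
            delay(seconds * 1000L)           // hold the door open
        } finally {
            withContext(NonCancellable) {    // re-lock even if the caller is cancelled
                door.lock(doorId)
            }
        }
    }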

IoT: Three types of Things

Could IoT things be classified according to their instruction source rather than their function?

Local Thing – instructions derived on board
Global Thing – instructions derived from a global source
Swarm Thing – instructions gained from other things in the swarm

Things can be combinations of the above and/or switch according to context.
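A toy Kotlin sketch of that last point, where a thing's instruction sources can be a combination and can change with context; the names and the fallback rule are ours, purely for illustration.

    // Toy sketch: a thing whose instruction sources can change with context.
    enum class InstructionSource { LOCAL, GLOBAL, SWARM }

    class Thing(var sources: Set<InstructionSource>) {
        // Example context switch: if the global connection drops,
        // fall back to on-board (local) instructions only.
        fun onGlobalConnectionLost() {
            sources = (sources - InstructionSource.GLOBAL)
                .ifEmpty { setOf(InstructionSource.LOCAL) }
        }
    }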