From October 2015 to September 2016 I spent a year building two new features for the BBC News iOS app: the first a ground-breaking portrait news video feature, and the second the internationalisation and localisation of the app into three languages as part of the largest expansion of the BBC World Service since the 1940s.

Vertical Video

The vertical video feature, launched to the public in November 2016, is an evolution of the product to satisfy BBC News audiences who engage primarily on a mobile phone or tablet. A growing market, I am sure everyone agrees.

I designed and implemented the data source and events architecture of this feature, and it turned out to be a case of something old and something new. OK, maybe something old and something older!

Data Source
[Screenshot: BBC News Top Stories on iPhone]

If you haven’t seen the feature (take a look at the video above), it is a series of videos viewed with the phone held in portrait orientation. Each video fills the entire screen; a swipe from right to left takes the user to the next one, and a swipe from left to right takes them back one. It seemed obvious that the requirement was to display a list of videos, in the same way that a UITableViewController displays a list of cells, and that it should be backed by a data source in the same way.

The data source implemented for this feature supplies video cards in the same way that a UITableViewDataSource supplies cells, but it also handles the asynchronous loading of images and related stories, all of the state associated with those activities, and the different card types, since there are bookend cards at the beginning and end of the sequence which behave differently.

As the number of video cards this data source needs to hold is relatively small, around ten, a beautifully simple interface could be created around it, with most of the heavy lifting performed by NSPredicate.
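To give a feel for the shape of it, here is a minimal sketch of that kind of data source in Swift. It is illustrative rather than the production code: the names (CardType, VideoCard, VideoCardDataSource) are invented for this post, and the cards are NSObject subclasses purely so that NSPredicate and key-value coding can do the filtering.

import Foundation

// Card types: bookend cards sit at either end of the sequence and behave
// differently from the video cards in between.
@objc enum CardType: Int {
    case openingBookend
    case video
    case closingBookend
}

// A card is KVC-compliant (an NSObject with @objc properties) so that
// NSPredicate can query it directly.
final class VideoCard: NSObject {
    @objc let identifier: String
    @objc let type: CardType
    @objc var hasLoadedImage = false

    init(identifier: String, type: CardType) {
        self.identifier = identifier
        self.type = type
        super.init()
    }
}

// Supplies cards much like a UITableViewDataSource supplies cells. With only
// around ten cards, filtering the backing array with NSPredicate is cheap and
// keeps the query logic declarative.
final class VideoCardDataSource {
    private let cards: [VideoCard]

    init(cards: [VideoCard]) {
        self.cards = cards
    }

    var numberOfCards: Int {
        return cards.count
    }

    func card(at index: Int) -> VideoCard {
        return cards[index]
    }

    func cards(matching predicate: NSPredicate) -> [VideoCard] {
        return (cards as NSArray).filtered(using: predicate).compactMap { $0 as? VideoCard }
    }

    // Example query: video cards whose images are still loading.
    func videoCardsAwaitingImages() -> [VideoCard] {
        let predicate = NSPredicate(format: "type == %ld AND hasLoadedImage == NO",
                                    CardType.video.rawValue)
        return cards(matching: predicate)
    }
}

The asynchronous image and related-story loading, and the state that goes with them, hang off the same class in the real thing, but the sketch above is enough to show the UITableViewDataSource-like shape.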

Bubbling events
[Screenshot: portrait video playing in the BBC News iPhone app]

As more of the feature’s architecture was built on top of the data source, the relationship between the video cards and the data source started to remind me of things I had built with RobotLegs and PureMVC (which I see has a Swift port) back in my Flex days. As you have probably worked out, this is the something older part. :-)

I started to notice a pretty common pattern in UI architecture. You have View Models / Value Objects / Model Items (depending on your approach / philosophy), and these are used when creating a unit of UI, such as a cell. On top of this, the user may need to perform a task on that piece of UI at some indeterminate time after it has been instantiated. One such case we had was View Related Story, and responding to that user action requires more information from the data source. You can very quickly get into a situation where the data source knows about the UI and the UI knows about the data source, which is not a good way of going about things, as I am sure you are already aware.

[Screenshot: BBC News portrait video player on iPhone]
A way out of this is to say that the data source can know about the UI; in fact it is very likely to be responsible for creating it, e.g. tableView(_:cellForRowAt:). The UI, on the other hand, should be as dumb as possible: when something happens, for instance the user taps something, it dispatches an Event. The Event is handled by a class in the logic layer that knows about the data source; that class retrieves the necessary data and injects it directly back into the UI.
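Here is a sketch of that split, again with illustrative names rather than the ones we actually used (ViewRelatedStoryEvent, VideoCardView, RelatedStoryCoordinator, RelatedStoryProviding): the view only dispatches an Event, and a logic-layer object that knows about the data source pulls out the related stories and hands them straight back.

// The Event is a plain value describing what the user did.
struct ViewRelatedStoryEvent {
    let cardIdentifier: String
}

// The slice of the data source the logic layer needs for this task.
protocol RelatedStoryProviding {
    func relatedStories(forCardWithIdentifier identifier: String) -> [String]
}

protocol VideoCardEventHandling: AnyObject {
    func handle(_ event: ViewRelatedStoryEvent, from view: VideoCardView)
}

// The view is as dumb as possible: it knows nothing about the data source and
// simply dispatches an Event when the user taps View Related Story.
final class VideoCardView {
    weak var eventHandler: VideoCardEventHandling?
    let cardIdentifier: String

    init(cardIdentifier: String) {
        self.cardIdentifier = cardIdentifier
    }

    func relatedStoryButtonTapped() {
        eventHandler?.handle(ViewRelatedStoryEvent(cardIdentifier: cardIdentifier), from: self)
    }

    func show(relatedStories: [String]) {
        // Render whatever the logic layer injects; the view never asks for it.
    }
}

// The logic layer knows about the data source, retrieves the necessary data
// and injects it directly back into the UI.
final class RelatedStoryCoordinator: VideoCardEventHandling {
    private let storyProvider: RelatedStoryProviding

    init(storyProvider: RelatedStoryProviding) {
        self.storyProvider = storyProvider
    }

    func handle(_ event: ViewRelatedStoryEvent, from view: VideoCardView) {
        let stories = storyProvider.relatedStories(forCardWithIdentifier: event.cardIdentifier)
        view.show(relatedStories: stories)
    }
}

Neither side knows the other's internals: the view only knows something is willing to handle its events, and the coordinator only knows the narrow protocol it needs from the data source.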

Events in iOS are defined as objects sent from the hardware to your app; I am not talking about those kinds of events, and I am definitely not advocating NSNotification for the task. I am talking about Event objects that are dispatched and observed within the same app, in this case within the same feature. We used the delegate pattern to pass Events back up through the class hierarchy, though I suspect a very simple Event Bus, tied to the feature, would be a little neater.
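For what it's worth, the "very simple Event Bus" I have in mind is not much more than the sketch below. It is not something we shipped, and the names (FeatureEvent, FeatureEventBus) are invented here; the point is that the bus is owned by the feature, so events never leak out into the rest of the app.

// One bus instance per feature; every view dispatches into it and the logic
// layer subscribes once.
protocol FeatureEvent {}

final class FeatureEventBus {
    private var handlers: [(FeatureEvent) -> Void] = []

    func subscribe(_ handler: @escaping (FeatureEvent) -> Void) {
        handlers.append(handler)
    }

    func dispatch(_ event: FeatureEvent) {
        handlers.forEach { $0(event) }
    }
}

// Usage sketch: conform an event to FeatureEvent, subscribe once in the logic
// layer, and dispatch from any view in the feature.
// extension ViewRelatedStoryEvent: FeatureEvent {}
// let bus = FeatureEventBus()
// bus.subscribe { event in
//     if let event = event as? ViewRelatedStoryEvent { /* handle it */ }
// }
// bus.dispatch(ViewRelatedStoryEvent(cardIdentifier: "some-card"))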

Even in these days of popular modern frameworks, which more often than not seek to define how objects talk to each other, it’s nice to still be able to rely on well-established techniques that have stood the test of time.