{"Real Time APIs"}

Real Time Is Often More About What They Desire Than What We Want

There are many definitions of what exactly constitutes "real time". I find it is a very relative thing, depending on who you talk to. When asked, many will offer push notifications as an example. Others immediately think of chat and messaging. If you are talking to developers, they will reference specific technologies like XMPP, Jabber, and WebSockets.

Real time is relative. It is relative to the situation, and to those involved. I'd also say real time isn't good by default, in all situations. The need for real time might change or evolve, and mean different things in different industries. All of this variance really opens the concept up to a lot of manipulation and abuse.

I feel like those who are wielding real time often speak of the benefits to us, when in reality it is about real time in service of what they desire. They want a real time channel to you so they can push to you anytime, and get the action they are looking for (i.e. click, view, purchase). In this environment, the concept of real time quickly becomes just noise, distraction, and many other negative things, rendering real time, more often than not, a pretty bad idea.

See The Full Blog Post


Making Web Concepts and Specs Present As Real Time Help In API Design Tooling

I took the Github repository for Erik Wilde's (@dret) Web Concepts work and forked it, then generated some JSON which I could use to import into my API monitoring system. I've been manually adding specs to my Tweet and LinkedIn scheduling system, but I keep forgetting to go back to the site and add more entries. So I wanted to go ahead and import all the concepts and specs, and schedule out the tweets and LinkedIn posts for everything, over the next couple of months.

First I generated the JSON for the concepts:
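Something like this simplified Python sketch captures what I was after--the field names and URLs here are placeholders for illustration, not the exact structure of the Web Concepts repository:

# Build a simple list of concept entries and dump them as JSON.
# The fields below are illustrative placeholders, not the actual schema.
import json

concepts = [
    {
        "concept": "HTTP Header Field",
        "description": "Header fields defined across the HTTP specifications.",
        "url": "http://webconcepts.info/"  # placeholder link to the project
    },
    {
        "concept": "HTTP Status Code",
        "description": "Status codes defined across the HTTP specifications.",
        "url": "http://webconcepts.info/"
    }
]

with open("concepts.json", "w") as outfile:
    json.dump(concepts, outfile, indent=2)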

Then I generated the JSON for the specs:
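The same approach works for the specs--again, a sketch with illustrative fields rather than the actual output:

# Dump the specs the same way; the fields are again just placeholders.
import json

specs = [
    {
        "specification": "RFC 7231",
        "title": "Hypertext Transfer Protocol (HTTP/1.1): Semantics and Content",
        "url": "http://webconcepts.info/"  # placeholder link
    }
]

with open("specs.json", "w") as outfile:
    json.dump(specs, outfile, indent=2)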

I left out the relationships between the concepts and specs, as I will just be linking to Web Concepts, and letting people explore for themselves. Looking through the JSON got me thinking about why these concepts and specs aren't available in API design tooling, as helpers and tooltips, so that API designers and architects can learn from them and be reminded in real time--as they are crafting their APIs.

It seems like there should be autocomplete for HTTP header fields, HTTP status codes, and other relevant items as they are needed. There is a wealth of web literacy available in Erik's work, across the web concepts and specs he has organized, and it seems like it should be available by default within API design services and tooling, and start being baked into IDEs like Atom, Eclipse, and Visual Studio--maybe it already is, and I'm just unaware.
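To make the idea a little more concrete, here is a rough sketch of how a design tool might autocomplete against a concepts file like the one above--the file name, fields, and function are my own assumptions, not something any existing editor does today:

import json

# Load the concepts generated earlier (assumed to be a list of objects
# with "concept" and "url" fields).
with open("concepts.json") as infile:
    concepts = json.load(infile)

def autocomplete(prefix):
    """Return the concepts whose names start with what the designer has typed."""
    prefix = prefix.lower()
    return [c for c in concepts if c["concept"].lower().startswith(prefix)]

# Typing "HTTP S" in a design tool could surface "HTTP Status Code",
# with a tooltip linking back to Web Concepts for the full details.
for match in autocomplete("HTTP S"):
    print(match["concept"], "-", match["url"])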

See The Full Blog Post


Fine Tuning My Real Time For Maximum Efficiency

I am working hard to fine tune my world after coming back from the wilderness this summer. Now that I'm back, I am putting a lot of thought into how I can optimize for efficiency, as well as for my own happiness. As I fire back up the old API Evangelist machine, I'm evaluating every concept in play, process being used, and tool in production, and assessing how each benefits me or creates friction in my world.

During the next evolution of API Evangelist, I am looking to maximize operations, while also helping to ensure that I do not burn out again (5 years was a long time). While hiking on the trail I thought A LOT about what real time is, and upon my return, I've been applying that thinking to reverse engineering what is real time in my world, fine tuning it for maximum efficiency, and making sure it helps me achieve my objectives.

As I had all the moving parts of real time spread out across my workbench, one thing I noticed was the emotional hooks it likes to employ. When I read a Tweet that I didn't agree with, or a blog post that needed a rebuttal, or a Slack conversation that @mentioned me, I felt like I needed to reply--when in reality, there is no reason to reply to real time events in real time. This is what real time wants, not always something you want.

I wanted to better understand this element of my real time world, so I reassembled everything and set it back into motion--this time I put a delay switch on ALL responses to real time events across all my channels. No matter how badly I wanted to, I was forbidden to respond to anything within 48 hours. It was hard at first, but I quickly began to see some interesting efficiency gains and better overall psychological well-being.

Facebook, Twitter, Github, and Slack were all turned off, and only allowed to be turned on a couple of times a day. I could write a response to a blog post, but I wouldn't allow myself to post it for at least two days. I actually built this delay switch into my world as a sort of scheduling system for my platform, which allows me to publish blog posts, Tweets, Github commits, and other pushes that were often real time, using a master schedule.
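A bare-bones sketch of the kind of delay switch I am describing might look something like this--the function names are mine, not the actual system, while the 48 hour delay comes from the experiment above:

from datetime import datetime, timedelta

DELAY = timedelta(hours=48)  # nothing goes out sooner than this
queue = []

def schedule(channel, message):
    """Queue up a response instead of firing it off in real time."""
    queue.append({
        "channel": channel,  # e.g. "twitter", "blog", "github"
        "message": message,
        "publish_after": datetime.now() + DELAY
    })

def publish_due():
    """Run a couple of times a day; only release items past the delay."""
    now = datetime.now()
    for item in [i for i in queue if i["publish_after"] <= now]:
        print("publishing to", item["channel"], ":", item["message"])
        queue.remove(item)

schedule("twitter", "A reply I badly wanted to send right away.")
publish_due()  # nothing will go out for at least two days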

After a couple of weeks, my world feels more like I have several puppets on strings, performing in a semi-scripted play--where before it felt the other way around, like I was a puppet on other people's strings, performing in a play I had never seen a script for.

See The Full Blog Post


The Real Time Device Software Update Certification Chain

Each device will push to the cloud, or to an intermediary, some sort of credential confirming that the latest vulnerability update was applied.
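To sketch out what that push might look like, here is a hypothetical example of a device reporting an update credential to the cloud--the endpoint, fields, and digest step are all assumptions, not an actual certification chain implementation:

import hashlib
import json
from urllib import request

# Hypothetical attestation that the latest vulnerability update was applied.
attestation = {
    "device_id": "device-1234",
    "firmware_version": "2.4.1",
    "update_id": "vuln-patch-2016-08",
    "applied_at": "2016-08-15T10:30:00Z"
}

# Stand-in for a real signature from the device's secure element.
payload = json.dumps(attestation, sort_keys=True).encode("utf-8")
attestation["digest"] = hashlib.sha256(payload).hexdigest()

# Push the credential to the cloud, or to an intermediary, for the audit chain.
req = request.Request(
    "https://example.com/certification-chain",  # placeholder endpoint
    data=json.dumps(attestation).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST"
)
# request.urlopen(req)  # left commented out; the endpoint above is a placeholder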

See The Full Blog Post


Working To Avoid The Drowning Effects Of Real Time

One thing I'm experiencing as I come out of my Drone Recovery project is the drowning effects of our real-time worlds. I am talking about the desire to stay connected in this Internet age, subscribing to as many available channels as possible (i.e. Facebook, Twitter, LinkedIn, RSS, etc.), and more importantly, the tuning in to, and responding to, these channels in real time.

You hear a lot of talk about information overload, but I don't feel the amount of information is the problem. For me, the problem comes in with the emotional investment demanded by real time, and the ultimate toll it can take on your productivity, or just your general happiness and well-being. You can see this play out in everything from the expectation that you should respond to emails immediately, all the way to social network memes grabbing your attention when it comes to the election, or for me personally, the concerns around security and privacy when using technology.

The problem isn't the amount of information, it is the emotional toll of real-time. I can keep up with the volume of information; it's once I start paying the toll fee associated with each item that it begins to add up. I feel the toll fee is higher in the real-time lane than when you engage on your own schedule. The people who demand I respond to emails, and be first to the story, have skin in the game, and will be collecting a portion of the toll fee, so it is in their best interest to push you to be real time.

Sure, some items in all of this will be perishable. I am not applying this line of thinking across the board, but I am prioritizing things with it in mind. In an increasingly digital world, the demands on our time are only going to increase. To help keep myself from drowning, I'm going to get more critical about what I accept into my world in a real time way. My goal is to limit the emotional toll I pay, and maximize my ability to focus on the big picture when it comes to how technology, and specifically APIs, are impacting our world.

See The Full Blog Post


Making Scientific Research More Real Time And Collaborative Using APIs

I heard about the Zika virus research going on at the University of Wisconsin while listening to an NPR episode this last spring. I finally had the time to dig into the topic a little more, and learn more about where the research is at, and some of the software behind the sharing and collaboration around it.

The urgency in getting the raw data and results of the research out to the wider scientific community caught my attention, and the potential for applying API-related approaches seems pretty huge. When it comes to mission-critical research that could impact thousands or millions of people, it seems like there should be a whole suite of open tooling that people can employ to ensure sharing and collaboration are real time and frictionless.

As I dug into the Zika virus research, I was happy to find LabKey technology being employed to publish the research. I do not know much about them yet, but I was glad to see an open source community solution, and developer resources including a web API for integrating with research that is published using the platform. There are plenty of improvements I'd like to see added to the API and developer efforts, but it is a damn good start when it comes to making important scientific research much more shareable and collaborative.

I'll spend more time learning about what LabKey currently offers, and then I'll work to establish some sort of next-steps blueprint that would employ other modern API approaches to help ensure important research can be made more real-time, aggregated, interoperable, and shareable, using things like API definitions, webhooks, iPaaS, and other common areas of a modern API effort.
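As a rough illustration of the webhook piece, something as simple as this could notify collaborators the moment new results are published--this is a generic sketch, not LabKey's actual API:

from flask import Flask, request

app = Flask(__name__)

@app.route("/research-updates", methods=["POST"])
def research_updates():
    """Receive a (hypothetical) webhook when a new dataset is published."""
    event = request.get_json(force=True)
    # e.g. {"study": "zika-rhesus", "dataset": "viral-loads", "url": "..."}
    print("New data published:", event.get("dataset"), event.get("url"))
    # From here you could sync the raw data, notify a mailing list, or
    # trigger an iPaaS workflow for downstream collaborators.
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)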

When it comes to research, scientists should have a wealth of open tooling and resources that make their work a collaborative and shareable process by default, but with as much control as they desire--something modern web API solutions excel at. I added LabKey to a new research area dedicated to science. I will spend more time going through the space, and see what guides, blueprints, and other resources I can pull together to assist researchers in their API journey.

See The Full Blog Post


Moving Cellular Towers In Real Time Response To Where Cellular Customers Are #DesignFiction

The age of the mountain top and building cellular tower is coming to a close. Our real-time cellular drone algorithm is now in active use in over 3300 telco fleets around the globe. After studying the daily activity of over 100 million cellular customers, we were able to assemble a reliable algorithm that guides fleets of cellular network drones to where they are most needed in real time.

It just didn't make sense to have cellular network towers remain stationary anymore when our mobile users are, well...mobile. We needed our cellular network to move, expand, and grow along with our user base. Thanks to advances in drone technology, our network of thousands of drones deploys, hovers, migrates, and returns home in real-time response to demand.

Our initial implementations were all successful due to the hard work and quality of our algorithm, but with the data we are now receiving from the 3000 drone fleets around the globe, our new Swarm Tower (TM) technology is insanely precise. Rarely do we ever see a network overload, or a mobile customer who cannot get the bandwidth they need (and are willing to pay for) at any given moment.

What has really surprised us is the use of the network beyond mobile. Early on, 90% of the network connections were consumer and business smartphone devices, where now over 60% of the network capacity is other consumer, commercial, and industrial devices. Expanding our mobile networks to meet this demand would never have been possible using traditional cell tower technology.

If you are as excited about Swarm Tower technology as we are, you will be interested in our Q2 announcements around secondary drone activities while they are supporting the cellular network--things like surveillance, weather, and other common applications.

See The Full Blog Post