A first responder’s guide to roadmapping while the earth is shaking
By: Nadav Levy, Senior Product Manager
Working as a product manager in a fast-paced environment has taught me a lot about rocky roads and about prioritizing while trying to run fast and deliver quickly. In this article I will outline three critical steps that every product manager goes through, along with my experience working in a high-pressure environment.
To give some context, Edgybees provides situational awareness solutions by augmenting live video streams with contextual information. Our solution is currently being used around the world by first responders, governments and industrial personnel in handling highly dynamic and life-threatening challenges.
Collect operational information
Every roadmap should start somewhere, and no matter if it’s your first day in the company or your third year, always begin by sitting down with your company’s leadership and hearing where they see the company heading – also known as company vision. I strongly encourage you to start by meeting people individually. Each one will have a unique perspective you can learn from!
Pumped with excitement, it’s now time for some groundwork. Schedule time with each team and start collecting every task, major issue, wish-list item and feature you come across. At this stage you’re mainly going to see patterns, technical debt and what is currently in the pipeline.
Now that you know where you want to be and where the company is today, fill in the gaps! Create Initiatives and Epics that will drive you towards the vision, and don’t be afraid to bake in plenty of research tasks. In Edgybees’ case, working with computer vision and on the front line of GIS (Geographic Information Systems) along with augmented reality, we deal with a fair share of unknowns. A good mix we found works for us is leaving around 30% of our roadmap for research tasks.
“If I only accomplish ONE thing, what would that be?”
Finding your north star when everything around you is moving is not an easy task. Start every roadmap process by asking yourself (and stakeholders) what is the single thing you want to accomplish. This question can make managing decisions much simpler. Then, one step at a time, and only after you are sure that the accumulated effort is doable, you can go on to identify and add the next challenge to the list.
As for KPIs and OKRs, try to list four to six objectives; we found that to be the sweet spot. If you have more, you have likely lost focus. If you have fewer, maybe you’re not breaking down goals or milestones well enough.
Once you have your objectives, identify the low cost / high reward features. If you need help figuring out the value and cost, ask your stakeholders to score them on a scale from 1 to 10 and average the answers. Then simply arrange all the features on a cost/value matrix.
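The scoring-and-averaging step can be sketched in a few lines of code. This is a minimal illustration, not a real prioritization tool: the feature names, the score lists and the 5-point threshold that splits the matrix into quadrants are all made up for the example.

```python
def prioritize(features, threshold=5.0):
    """Average stakeholder scores and place each feature on a cost/value matrix.

    `features` maps a feature name to a pair of lists of 1-10 stakeholder
    scores: (value_scores, cost_scores). Returns (avg value, avg cost, quadrant).
    """
    result = {}
    for name, (value_scores, cost_scores) in features.items():
        value = sum(value_scores) / len(value_scores)
        cost = sum(cost_scores) / len(cost_scores)
        if value >= threshold and cost < threshold:
            quadrant = "quick win"    # high value, low cost: do these first
        elif value >= threshold:
            quadrant = "big bet"      # high value, high cost: plan carefully
        elif cost < threshold:
            quadrant = "maybe later"  # low value, low cost
        else:
            quadrant = "time sink"    # low value, high cost: avoid
        result[name] = (round(value, 1), round(cost, 1), quadrant)
    return result

# Illustrative scores from three stakeholders per feature:
scores = {
    "live-map sync": ([8, 9, 7], [3, 4, 2]),
    "dark mode":     ([3, 4, 2], [2, 3, 2]),
}
print(prioritize(scores))
```

The point of averaging is simply to smooth out individual bias; once every feature has one value number and one cost number, the matrix quadrants fall out mechanically.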
As anyone who has worked at a startup knows, it’s an endless battle to keep delivering and stay on track when the road never stops moving. Business opportunities shake you up like earthquakes and stakeholders knock you side to side like hurricanes. Working with first responders, I can’t help but think of natural disaster metaphors when trying to deliver our roadmap. The only way to survive is by adopting the phrase: “Be flexible or break”. Many times you will come out with a solution that’s far from what you set out to create.
At Edgybees we found 3 elements that work well in managing storms:
1. Keep the roadmap alive. Don’t just communicate your plan twice a year, but actually keep it open during every sprint planning meeting and ask: “does this story or epic have anything to do with our objectives?”. Yes? Great! But if the answer you get too often is No… it’s time to review your roadmap.
2. Keep a stable core. When a new opportunity (or basically a distraction, until proven otherwise) comes in, don’t stop your sprint. Create a Task Force including only the necessary teammates with one person acting as Owner. Open a side Kanban board for the occasion and no matter how large or small the task force is compared to the sprint team, you still have a burning candle. As the project starts to calm down, you can gradually shift people back to the normal cycle without losing your rhythm.
3. Plan for the unexpected. Always leave 10-20% of your sprint open for surprises. This will help close a quick loop for those on-the-fly requests. Also, leaving some room for unplanned tasks creates a great way to mitigate some of the pressure from stakeholders.
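The “plan for the unexpected” buffer is easy to sanity-check during planning with a few lines of arithmetic. A small sketch, with illustrative numbers (the 40-point sprint and the 15% ratio are examples, not a recommendation):

```python
def plan_sprint(total_points, buffer_ratio=0.15):
    """Split sprint capacity into committed work and a buffer for surprises.

    A buffer_ratio between 0.10 and 0.20 matches the 10-20% rule of thumb.
    """
    buffer = round(total_points * buffer_ratio)
    return {"committed": total_points - buffer, "buffer": buffer}

# A 40-point sprint with a 15% buffer leaves 6 points free for on-the-fly requests.
print(plan_sprint(40))
```

Committing only to the “committed” number in sprint planning, and treating the buffer as already spent, is what keeps stakeholder surprises from blowing up the plan.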
To summarize and leave you with a short cheat-sheet, here are the main points discussed:
• Collect operational information: meet leadership one-on-one for the vision, then sit with each team to gather tasks, issues and wish-lists.
• Ask yourself and your stakeholders: “If I only accomplish ONE thing, what would that be?”
• Stick to four to six objectives, and prioritize low cost / high reward features on a cost/value matrix.
• Keep the roadmap alive: review it at every sprint planning meeting.
• Keep a stable core: handle new opportunities with a small task force instead of stopping the sprint.
• Plan for the unexpected by leaving 10-20% of each sprint open.
Hope that helps and keep safe!
Command & Control Centers, also known as “Operations Centers”, are the beating hearts of many organizations and industries. They are equipped with both hardware and software, and are used by organizations to manage critical events as they happen. Industries known to use these include emergency operations, military units, oil & gas and many more.
Military operations and emergency services need the Ground Control Station, as it ties together the communications technology that modern tactics depend on.
Private enterprises need an operations command center to manage their IT infrastructure across the entire network.
Oil, gas, and power companies need command and control centers from which to watch their resources, maintain safe operating conditions and respond to emergencies.
How do they operate?
Vast amounts of data are continuously collected and streamed into the stations: video footage (CCTV, UAS, drones, mobile), maps, team positioning and incident reports. The data is then analysed and communicated to the teams at all locations.
As data transmissions grow exponentially, so does the need to make sense of it all. The bigger difficulty is sorting through and differentiating between critical and non-critical information. Who should be receiving, who is sending, and what can be done with all of this data?
What about real-time?
Understanding the data in real-time, in order to make quick decisions, is also important. This is especially critical during emergency situations of disasters like fires, or army missions.
The Solution: seeing everything in one central place
The ideal solution is to streamline the data in real-time, in one place, where everyone can see it and react accordingly. The Command & Control Center is a perfect hub for such technology – which is where Edgybees comes in.
At Edgybees we are able to take any data and layer it over any live footage, to create an unprecedented view of any scene. Teams on the ground can both receive and send information, marking any points of interest, communicating back and forth to enable precise situational awareness. This is all done in real-time, under real live conditions, centralizing the operation.
Our customers report that Edgybees can actually reduce the number of systems at work while maintaining unprecedented situational awareness. The decision-making process also becomes much quicker and more accurate, given that there are fewer screens to look at and fewer maps overloaded with data that needs to be analyzed.
Does your organization have a Command & Control center? Which technologies are you using and do you lack real-time situational awareness?
Contact us today for more information
In this modern economy we have all come to depend on the oil and gas industry for our livelihood. They not only help fuel our cars and heat our homes, but also produce the plastic we so commonly use as well as the many lubricants, waxes, tars and asphalts.
Refineries and plants spread across the world, including the USA, the Middle East, Russia, China and Latin America, all point to one shared challenge: the need for real-time situational awareness.
While extracting oil and gas, plant managers face a range of safety, security and environmental challenges.
To manage these situations, plants take various precautions that include:
1. CCTV cameras set across the location
2. Drones and UAV monitoring
3. First response teams and regulations
4. Security personnel
The data streamed from these sources is gathered at a central Command & Control station, where 24/7 monitoring takes place using state-of-the-art hardware and software. These include large monitoring screens, communication platforms and analytical tools, all of which make the process easier for personnel to control.
Layers of critical information in real-time
Edgybees augments live video feeds with precise geo-information layers to give oil & gas plants immediate situational awareness. The video can be captured from any camera and combined with human input or other data sources, helping teams understand the situation, communicate immediately, and take control.
Let’s examine one example scenario, to see how the Edgybees solution works:
It is 10 in the evening. The C&C station receives alerts of a possible pipe leak, a change in pressure or heat. The staff immediately look at images coming in from the CCTV cameras and also get a drone up in the air for a closer view. The problem is detected, and all information is layered on the video in the form of shapes and text, along with maps and distances. The information is then streamed to the teams on the ground in one clear live video that includes all the details. The first responders can now view the live visual data on their mobile devices, and their locations are also layered onto the video. Staff on the ground, at the command station and in the air can immediately coordinate to get a hold of the situation until it is resolved.
This example is one of many, demonstrating the time saved and the full clarity gained while managing critical situations.
For more information on the Edgybees technology, contact us today!
By: Lee Kaplan
In the 1950s, during the Cold War, the United States Army needed an aerial reconnaissance tool that could see through clouds or fog at any time of day. Their solution was SAR, short for Synthetic Aperture Radar. The technology has become one of the most important tools for viewing the earth from a long distance with a high degree of accuracy, and is used today in a wide variety of applications. Since the 1950s much has changed technologically, and the field of computer vision has evolved to the point where it’s easy to see how, with the right people, these worlds can combine into something really special and innovative.
SAR images are hard to understand. They don’t look like ordinary pictures, so interpreting a SAR image requires a highly trained specialist. This limits the possible dissemination and value of SAR images to a very small audience. We at Edgybees can make SAR images easy to understand, even for the average user. The solution? Augmented Reality.
Imagine a large fire causing structural damage to a building, with other buildings in the area having already collapsed as smoke impairs visibility. SAR can “see” through smoke, allowing the user to assess the situation. If we could give you a blueprint of the buildings overlaid on the SAR image, you could immediately see where it aligns with the walls and where it doesn’t, giving you an understanding of what’s going on with the building.
Essentially, we are able to overlay an augmented reality layer on top of live video and give context to complex situations. This principle applies to SAR quite effectively, giving the user a visual aid to understand satellite imagery to the fullest extent.
Whether it be first responders, military, homeland security, energy, or broadcasting, the merger of augmented reality and SAR is an exciting prospect, and Edgybees is proud to be the industry leader in this space.
On November 12, 2019, the Greater Sydney region in Australia declared a state of catastrophic fire danger as flames engulfed massive stretches of land. As of mid-January 2020, 46 million acres had burned, destroying over 2,000 homes, 48 facilities and another 2,000 outbuildings. Over 30 people have been confirmed dead thus far, and the University of Sydney has estimated that approximately 480 million animals have been killed, with perhaps even entire species being wiped out.
The NSW fire service has been on the case for months now, fighting the fires and risking their lives for the safety of others.
Edgybees has been cooperating with NSW, providing the team with a cutting-edge augmented reality solution. The Edgybees solution is an Augmented Reality platform that creates a computer-generated overlay on top of video in real time. NSW utilized the technology to great effect in their firefighting operations, viewing the scene below from the perspective of a drone.
NSW was able to use the Edgybees platform to pinpoint the locations of fires, burnt structures, roads, teams on the ground, and even civilians, and mark them on a virtual map. The team was able to get a good bearing on their surroundings and focus their efforts on the job at hand.
We would like to thank NSW for their brave work fighting the fires in Australia, and we wish them the best of luck. You have our support!
And don’t forget to follow us on Twitter and Facebook for updates!
Fires are raging across Australia, and first responders need more than just the same old information to fight them. That’s where augmented reality software company Edgybees comes in. Kim talks with CEO and co-founder Adam Kaplan about how AR is helping first responders make more informed decisions that can save lives. The company takes live footage and overlays information that helps firefighters see, in real time, what they’re up against.
Watch the full interview:
By: Oren Cohen
Two years ago, I found myself arriving at an interview inside a renovated “chicken roost” in a village.
My first impression of Edgybees at the time was, ‘I love the idea! I hope we get seriously funded.’
I started as a Software Development Engineer In Test, which is a fancy title for the person responsible for Automation and also for building the QA process from zero.
I spent countless hours drafting test documents, flying drones, finding bugs, verifying the developers fixed them, and drinking lots of coffee. I loved it. I never flew a drone prior to working at Edgybees. I did, however, have experience and recommendations for doing an excellent job in QA and was also in my last year of Computer Science studies. Everything fell into place, and life threw challenges at me day in, day out.
Fast forward to 2020. I’m a Backend and Platform Engineer. Responsible for some modules in our backend suite, actively supporting live events, creating internal courses for the company’s staff, and still drinking lots of coffee.
During this journey, I had the privilege of working on software that is meant to help first responders save lives. I’d always considered myself the type of developer who creates video games. I’m grateful for the enlightening experience that taught me hard but valuable lessons for any developer.
Before we begin, here’s lesson number 0: the amount of knowledge I gained over two years at a start-up is impossible to acquire in any large high-tech organization with bigger teams.
Now, let’s dig in!
This point is first for a reason. How do you react to team members criticizing your work? Are you generally curious about what you could have done better? Are you seeking to learn from the more experienced people in your team? Or are you sulking for the rest of the day, thinking to yourself they don’t know what they’re saying?
If you are doing the first, congratulations, you are in for a whole lot of learning and a skill sharpening ride. That attitude will take you far and up, while your skill set will continue to grow every day.
However, if you’re not taking that road, it’s going to be challenging to learn anything new – especially when you’re always right, and everyone else is always wrong or doesn’t understand your reasoning. Don’t you think?
Leave your ego where it won’t interfere in your daily office activities, and watch your knowledge grow. Your peers will respect you for the willingness to learn and ability to see their perspective as well.
The business of saving lives isn’t laid-back. Usually, windows of opportunity are short, and the opportunities themselves are stretching the team’s capacity. That’s why it is super important to be proactive. Do you see a critical bug? Fix it. Don’t wait to be assigned a task to do it.
I know how sometimes there are tasks we don’t want to fall into our plate. “Could someone else do it?” was a recurring thought for me at the beginning of 2018, when I just started writing business code instead of QA Automation.
As we grow and experience crisis and hardship at work, we understand these lessons. If I had just fixed it when I found it, I wouldn’t be working on it at 1 AM.
Don’t give up on doing the right thing because of laziness or procrastination. Let’s face it — we both know that you know when a bug is under your jurisdiction.
When saving lives is on the line, we don’t waste precious time.
But the flip side of this is not to take on something you’re not familiar with that will take even more time to solve. For example, at Edgybees we deal with Augmented Reality, but I wouldn’t know the first thing about fixing a bug in any related algorithm. I deal with the backend and platform. That’s where my power is. I don’t stretch it into territories that will waste our time.
Know your strengths and responsibilities and act on those without permission. If you’re independent, you can keep growing on your own.
When saving lives becomes your responsibility, it’s stressful.
Everything you do becomes ladled with meaning. Will the change to the Platform structure affect our Computer Vision modules? Will a change in the backend break the front-end? When is this coming out to Production? Will QA have enough time to test?
Endless questions! You can’t stop thinking about what it means if you break anything.
It doesn’t lead anywhere, believe me. I do have a suggestion, instead.
Trust the structure.
Whichever code you want to merge into the project will be reviewed by a team member. That code should also pass unit and integration tests. After clearing those, it should also get a green light from QA.
Only then, your code will be on its way to actual customers.
So why worry? Do your best work. Make sure that the reviewer is familiar with the project you’re working on, and there are tests to validate that work.
And what to do when that structure is partial or non-existent? That takes us back to our previous point — invent it. Don’t wait to be tasked with creating unit tests for your code. I will even go as far as saying that if you’re building something completely new, budget the time for writing these tests within the task estimation.
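What “writing the tests alongside the code” can look like in practice: here is a minimal sketch using plain pytest-style test functions. `clamp_altitude` is a hypothetical helper invented for the example, not actual Edgybees code.

```python
def clamp_altitude(meters, floor=0.0, ceiling=120.0):
    """Hypothetical helper: keep a reported drone altitude inside legal bounds."""
    return max(floor, min(meters, ceiling))

# Plain pytest-style tests: small, fast, and budgeted into the same task
# estimate as the helper itself.
def test_within_bounds_is_untouched():
    assert clamp_altitude(50) == 50

def test_negative_is_raised_to_floor():
    assert clamp_altitude(-3) == 0

def test_above_ceiling_is_capped():
    assert clamp_altitude(500) == 120
```

Running `pytest` on this file executes all three checks on every change, which is exactly the safety net that lets you stop worrying about the what-ifs.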
Just don’t think about the critical what-ifs. They won’t help your progress in any way.
There’s nothing I love more than shouting “YES!” in the office when something I worked on — small as it may seem — works as I intended.
It might seem childish to some, but it keeps me motivated.
Another way to get excited and motivated is to break down a big task into smaller tasks and then start to chip away at them.
Here’s a short example:
• Write a new Module for the Kubernetes cluster that does X.
• Write Handler 1 of the module.
• Write Handler 2 of the module.
• … Write Handler N.
• Dockerize the module.
• Integrate with DB.
• Integrate with Kubernetes.
• Drink a glass of Hot Chocolate and marvel at your work.
Do you see it? Now, instead of one foreboding task, you have many small ones that are a lot easier to accomplish.
When you cross off these tasks, celebrate this fact. Acknowledge the progress you made.
If you don’t do it, you’ll be bored. You’ll lose interest quickly and start falling into the “what am I doing with my life?” rut.
You’re not a librarian at an empty library. You’re a software engineer, and you learn something new every single day. Your work is going to affect many people. No matter if dozens, hundreds, or even millions. Your work will change lives. Act like it. Be willing to celebrate the process.
You’ll thank yourself later (and me).
Being a Team Player is probably the most crucial point in this entire story. We already talked about structure. Your team members help keep your work in a pipeline before it reaches customers. But, that’s not all being a team player means.
You see these people every day, for more hours than you see your own family. If a family is defined by the amount of time spent with someone, your team is your family.
Don’t you think it’s important to know a thing or two about these people? To know you can rely on them? Or perhaps to let them know they can depend on you as well.
It doesn’t matter what your business is; software can range from casino platforms to first-responder tools. You’ll always have someone who did the research, or has much more experience than you in the field. You should feel comfortable asking them questions and learning from them.
Being a team player doesn’t put full responsibility on them. Yes, they do need to be courteous and answer your questions or refer you to helpful resources. But, what many inexperienced developers take for granted is that a team player should also feel comfortable asking questions and saying the three magic words: “I don’t know.”
In my business, if you need help, you ask for help. We don’t have time for personal insecurity; struggling alone consumes far more time than a question to a knowledgeable team member ever will.
Don’t feel comfortable asking for help? That’s when it’s time to challenge the structure. Talk to your boss about changing the work environment in a way that promotes healthy communication between team members.
A business without healthy team communication is not going to last long. Make sure to be proactive about improving that communication in the companies that employ you.
After all, a strong team is a win-win for everybody.
We’re happy to announce that Edgybees will be exhibiting at and attending a number of events over the next couple of months. Among these are:
DGI 2020, the global geospatial intelligence conference, at the Royal Lancaster London from January 20th to 22nd.
2020 Ourcrowd Global Investor Summit, one of the fastest-growing tech conferences in the world taking place in Jerusalem, Israel on February 13th.
The Singapore Airshow, Asia’s largest aerospace and defence event, at the Changi Exhibition Centre from February 11th to 16th.
Verizon 5G Experience Day in 2020. We’re excited to show off our innovative augmented reality solution at these shows, as well as at more to come during the new year.
Make sure to stop by! Be sure to contact us and schedule a meeting too.
Don’t forget to follow us on social media to keep up with updates from the events.
We’re happy to announce that Edgybees will be attending CES in Las Vegas, Nevada on January 7th-10th, 2020. CES is one of the most prominent technology conventions in the world, and Edgybees is ready to deliver with our cutting edge Augmented Reality solution. Make sure to stop by our booth and check out our exciting new demos and presentation.
Planning to attend? Make sure to contact us and schedule a meeting.
With 2019 coming to a close, we took some time to interview Edgybees’ CTO about kicking off 2020.
Question 1: How has Edgybees delivered on its ambitious AR promise and what are some 2020 goals in this area?
Answer: We’ve gotten a lot of mileage, even though we’ve only scratched the surface. We have already achieved accuracy down to one meter, which was the goal we set for ourselves when we set off, but as the saying goes, ‘with the food comes the appetite,’ and now we want to drive it even further. We want to be more robust and achieve this even under poor conditions with regard to lighting, obstructions, scenery, or changes relative to the reference material. One KPI we feel we are in a good place with is latency, where we already achieve something between 100 and 120 milliseconds; for everything else, we’ve done the first 80 percent and now we need to do the next 80 percent.
As for some 2020 goals, as I already said, we want to reduce the dependence on reference imagery. We’ve already started developing the algorithms that will allow us to be self-sufficient and function even if we arrive at a new location where we have limited or outdated reference imagery, or no reference imagery at all. We want to strengthen the machine learning side of the house to understand what we are looking at, specifically with regard to structures, buildings, trees, and so on, and use these to achieve better accuracy and better augmentation alignment with the scene. We want to understand 3D scenery better: if somebody is on the third floor of a building, we’ll be able to know that; if there’s a 20-meter-tall building, we will be able to understand that too. We want to extend to supporting new materials: currently most of our work is on electro-optics, visual and infrared, and we want to extend into radar, specifically synthetic aperture radar (SAR). We also want to extend our offering beyond the defense market to industrial and commercial applications, in particular agriculture, infrastructure, oil & gas, mining, and potentially also broadcasting and sports.
Question 2: You mentioned integrating SAR technology with Edgybees, how does that element come into play here?
Answer: SAR images, for lack of a better word, are weird. They don’t look like anything else, so currently, in order to understand a SAR image, a person has to be very highly trained, which limits the possible dissemination and value of SAR images to a very small audience. We at Edgybees, however, make it easy to understand SAR images even for the average Joe, so those images now have value to a much wider range of users. Couple this with SAR’s unique ability to see through clouds and other obstructions and detect objects on the ground even without direct visibility to them. Now that we can explain to users what it is that they are seeing, you get a very unique offering.
Question: Could you give an example of a real use case where this could help out?
Answer: I’ll give you two examples. Let’s say there is a very large-scale fire that has caused structural damage to buildings; some buildings have already collapsed, but you cannot see this because the area is covered in smoke. SAR sees through smoke, so the user can see which buildings are still standing and which have already collapsed. However, due to SAR’s weirdness, when you see the image you cannot really understand what is going on. If we give you a blueprint of the buildings overlaid on the SAR image, you can immediately see where it aligns with the walls and where it doesn’t, so you get an understanding of what’s going on with the building.
Another example is for military applications, where you have a camouflage net that is blocking light and there is some equipment underneath it. A commander who wants to communicate with his platoon and assign one team to a certain target, and another to a different one, would otherwise need to explain it over the radio in a very complicated and hard-to-understand way. If each target has a designation code and the teams have this picture in hand, it’s very easy to understand: ‘I need to take target number 4,’ or 5, or whatever.
Question 3: What is geo-registration and how do you use it for your ends?
Answer: When you get a video feed, or let’s say a single video frame, we want to add rectified GIS information into it. In simple words, I want to accurately place things that have a known location in the world into the image. The image usually comes from the aircraft with some position information, and this position information describes the location of the aircraft. It is important to note that this does not describe the position of the thing being filmed, but the aircraft itself: the GPS location of the aircraft and the orientation of the camera. All this positioning data is highly inaccurate. GPS is inaccurate; we’re talking about plus or minus 6 meters horizontally under optimal conditions, and up to 40 meters vertically. Many people are not aware of this, but GPS is extremely inaccurate on the vertical axis. The orientation of the camera can have an error of up to 10 degrees, and the pitch sensor an error of up to 3-4 degrees. If you combine all these, the resulting error on the image plane can be tens of meters, which of course means that if I try to highlight a specific location, say a junction of two roads, I could be highlighting the next junction over. The errors are very big, so in order to have valuable rectification of information over the video, we need a much better understanding of where the image really is. We do this by aligning the image to reference images. This process is called georegistration, and it is an incredibly valuable tool that allows us to perform some of our most important functions at Edgybees.
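To see why those individual errors add up to tens of meters on the ground, here is a rough back-of-the-envelope sketch. The formula is a deliberate simplification for a camera pointing roughly straight down (GPS error plus altitude times the tangent of the angular error); it is an illustration of the error budget described in the answer, not Edgybees’ actual georegistration math, and the 150 m altitude is an assumed example value.

```python
import math

def ground_error(altitude_m, gps_error_m, orientation_error_deg):
    """Rough worst-case ground displacement from platform sensor errors.

    For a roughly nadir-pointing camera, an orientation error of theta
    degrees shifts the projected point by about altitude * tan(theta);
    the platform's own GPS error adds on top of that.
    """
    angular_shift = altitude_m * math.tan(math.radians(orientation_error_deg))
    return gps_error_m + angular_shift

# Figures from the interview: +/- 6 m horizontal GPS error and up to a
# 10-degree camera orientation error. At an assumed 150 m altitude, the
# projected point can land tens of meters away from the true location.
err = ground_error(altitude_m=150, gps_error_m=6, orientation_error_deg=10)
print(f"worst case: about {err:.0f} m off on the ground")
```

Even this crude model shows why raw telemetry alone cannot place overlays accurately, and why aligning each frame against reference imagery is necessary.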
Question 4: What are your thoughts on the competitive landscape from a technical perspective?
Answer: There are other companies and government institutions offering some of the same functions that we offer, but we don’t see anything competing with us on the same technologies to offer said functions. In simple words, we see them use hardware to add augmentation into the video, we don’t today see anybody using video processing in order to add visual indicators into the video. In this case we are unique because we are agnostic to the video coming in. SAR is a good example of this. We do not require any modification to the payload – the camera or the aircraft – and we can work with video coming in from very light-weight platforms where you simply cannot add additional instrumentation on the hardware. We are not aware of any direct competition, but we are keeping our eyes open because something will turn up sooner or later.