At Skeddly we’re focused on bringing you the best in AWS help tutorials, AWS scheduler services, and AWS backup services. However, from time to time we like to reach out to other leaders in the AWS space to help you, our blog readers, stay on top of the latest developments and news within the AWS ecosystem.
Today Skeddly had the opportunity to speak with a few of the bright minds behind Keyhole Software, a cutting-edge technology company with an impressive amount of AWS experience. Below, we talk about the approach they used to build their dev studio from a two-person shop serving a single client into a 100-person team serving over 25 enterprise clients.
Without further ado, let’s jump into the interview.
Hi, and thank you for joining us today. Let’s jump right in: tell us a little bit about Keyhole Software’s history and how you got to where you are today.
Thanks for asking our team to be a part of this!
Keyhole Software was formed as the result of the friendship between company founders Chris DeSalvo and David Pitt. The pair met when previously employed by the Kansas City division of CrossLogic Corporation, an application development consulting firm.
Through superb technical skills and the dedication of its employees, CrossLogic was successful. In a logical match, the company was acquired by Number Six Software, and then again by Advanced Technology Systems Corporation (ATSC). With the knowledge gained through their stellar experiences, the pair spun off in 2008 and created what is now known as Keyhole Software.
Our goal is to continually keep an eye on what is viable for our clients and ensure we have the competencies to best serve them when they look to move in that direction.
AWS is the leading cloud service provider, garnering more market share than the next three cloud leaders combined. Why do you believe that AWS is the go-to cloud solution for small startups and big businesses alike? What does AWS do that the others don’t?
The AWS brand is well known. It is often the first option mentioned by our clients when discussing cloud computing options. It’s relatively easy for new customers to understand, and there is prolific documentation and “best practices” guidance available to get started.
You also do readiness assessment and road-mapping. Can you tell us more about what types of projects need readiness assessment and road-mapping, as well as what major client oversights you find when you begin this process?
We do! The thing is, even a company that last updated its tech stack a decade ago can run into a slew of problems.
The challenge is that it’s one thing to say “let’s move to the cloud” and another to create a strategic plan and ensure that the modernization effort follows best practices. Enterprise applications can power entire business units, so any change to them should be well documented, communicated, and understood.
Our first step is a careful analysis of the current application architecture, tooling, and environments. This can include interviews with members of the client’s technical team to surface architectural constraints, current practices, and blockers.
We produce a document with our identified findings, including a summary of suggested approaches that can deliver the most value to the organization (lift-and-shift, re-architect, re-platform, or others).
One caveat that clients sometimes don’t think of in custom development is that current development personnel must be educated to be successful with any new technologies implemented. Recently, Keyhole consultants led an initiative at a large financial services firm to move a monolithic, legacy Java application running in the organization’s on-premises data center to a microservice-based suite of applications running in the cloud. The application was iteratively moved to containers, orchestrated by Kubernetes in the AWS Cloud.
For client teams that have only worked on a monolithic application in a legacy Java environment, a microservices environment is a whole new world with many more independent, moving parts. It was important to teach the team to be successful with Docker, Kubernetes, containers, microservices, AWS, DevOps, orchestration, and cloud techniques. We provide a variety of education services, such as lab/lecture courses, informal exercises, and mentoring, to help.
There are so many interesting things you can build on AWS. The possibilities are almost limitless. In the case of Keyhole Software, what are the main types of projects that come through your door? What’s your bread and butter?
We build full-stack applications for enterprise customers as well as for internal software automation. Server software for these applications is configured and managed using Docker-based containerization. Generally we use ECS, or, if more complexity and scalability are required, we deploy OpenShift for platform management on AWS EC2 instances. We commonly use services such as S3 and RDS for data storage, and Simple Email Service (SES) for email support. User management is accomplished using IAM. For batch processing and ETL-type requirements, we utilize Lambda.
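To give a feel for the Lambda-based batch/ETL piece mentioned above, here is a minimal handler sketch. Everything in it (the event shape, field names, and transform) is a hypothetical illustration rather than Keyhole’s actual implementation; a real deployment would write results to S3 or RDS via boto3 instead of returning them.

```python
import json

def normalize_record(record):
    # Trim whitespace and lowercase string fields so downstream
    # storage (e.g. an RDS table) stays consistent.
    return {key: value.strip().lower() if isinstance(value, str) else value
            for key, value in record.items()}

def handler(event, context=None):
    # Lambda-style entry point: clean each incoming record and report
    # how many were processed.
    records = event.get("records", [])
    cleaned = [normalize_record(r) for r in records]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(cleaned), "records": cleaned}),
    }
```

Keeping the transform in a small pure function like `normalize_record` also makes this kind of batch logic easy to unit test outside of AWS.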
Speaking of interesting things you can do with AWS let’s talk a little bit about AI. AWS now has artificial intelligence capabilities, allowing developers to incorporate AI into their websites and applications. Can you tell us a little bit more about these AI tools and how you’ve seen people use these tools in creative ways within their own applications?
We have not directly engaged the AWS Machine Learning services, but we have built proof-of-concept applications using TensorFlow and homegrown neural-net image recognition applications. We’ve invested in learning ML to gain a better understanding of the technology so we can help our enterprise customers identify use cases and plan adoption.
AWS also has a huge suite of real-time analytics tools for big data processes, data warehousing, dashboard creation, interactive analytics and much more. In your experience, what are some of the most valuable yet overlooked AWS analytics tools?
In our experience, the most overlooked tooling seems to be the foundational aspects of big data and analytics. Most of this relates to the creation of data lakes and the ETL/ELT processes used to fill the lake. Oftentimes clients want to skip to the end and focus on all the cool stuff (AI, dashboards, and interactive analytics, for example) before they have a fundamentally sound data store to ask questions against and glean effective insights from. The AWS Lake Formation tooling helps with these building blocks and removes the complexity from the process so that true BI and OLAP activities can be performed reliably.
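To make the “foundational” point concrete, one such building block is simply laying out raw data in the lake with Hive-style partitioned S3 keys, which query engines like Athena can prune at query time instead of scanning everything. A minimal sketch (the dataset and file names are hypothetical):

```python
from datetime import date

def lake_key(dataset: str, event_date: date, filename: str) -> str:
    # Build a Hive-style partitioned S3 key (year=/month=/day=) so that
    # engines such as Athena can skip irrelevant partitions.
    return (
        f"raw/{dataset}/year={event_date.year:04d}"
        f"/month={event_date.month:02d}/day={event_date.day:02d}/{filename}"
    )

print(lake_key("orders", date(2024, 1, 5), "part-0001.json"))
# prints: raw/orders/year=2024/month=01/day=05/part-0001.json
```

Getting conventions like this right up front is what makes the later dashboards and interactive analytics reliable.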
Tech is advancing so quickly and things like big data, wearable tech, AI, blockchain and IoT are making huge waves in the media. AWS has done a good job of keeping up to date by offering services compatible with this wave of new technologies. However, how much of this buzz is making its way into actual products? At Keyhole Software are you seeing a lot more of this type of work, or projects with these newer technologies embedded within?
We have taken a deep-dive approach in anticipation of customer adoption of blockchain technology. We have specifically focused on open-source frameworks for permissioned blockchain networks, like Hyperledger Fabric, which is hosted by the Linux Foundation. (We feel this is much more likely to be applicable to our clients’ business needs, as all users are known and authenticated.) We have built our own Hyperledger Fabric blockchain network and even had one of our open-source tools accepted into Hyperledger Labs. (See our open-source Byzantine Tools for a reference implementation.)
We’ve become the sponsor/host of Hyperledger Kansas City and are working to educate the community about the viability of the blockchain framework technology, and particularly, how it could impact our enterprise clients.
Most of our clients are enterprise-level, so the two main areas where we have delivered actual projects are IoT and Big Data. IoT is gaining more and more popularity as our large clients look to automate logistics and comply with ever-changing regulations. Oftentimes these clients realize that they already have Big Data concerns just from historical transactions, but they are either archiving or partitioning those datasets away from their daily operational workflows. Once they realize what they have and what they will potentially be ingesting with new initiatives (e.g. IoT), Big Data becomes the lowest barrier to entry into a cloud-based solution.
As for the others, we hear a lot about them and clients ask for information about them, but the practical application has not materialized into projects as of yet.
Lastly, over the last couple of weeks, I’ve had some conversations with small business owners and startups who feel intimidated by AWS and all of its offerings. What advice would you impart to small business owners who find the features and benefits of AWS alluring, but feel intimidated by the unfamiliarity of the platform?
Any change is going to be challenging, even if it is for the better. The best way to mitigate the challenges is to remove the unfamiliarity.
AWS offers free-tier accounts so clients who want to dip their toes in can get started in a sandbox fashion without putting too much skin in the game. This also gives internal resources the opportunity to experiment without much consequence to the business. After that, starting out with managed services like RDS or ECS lets the business get the benefits of the functionality without all of the complexity and maintenance that would come with building from scratch.
Another option is to partner with an external firm to develop a needed portion of functionality in AWS. Then, once complete, perform an in-depth knowledge transfer showing everything that was done, reviewing landscape diagrams and design artifacts, and walking through working examples. This can greatly reduce the amount of “head first” learning and help impart best practices into the SDLC early on.
Thank you greatly for taking the time to do this interview and share your thoughts with Skeddly’s blog readers today. We truly appreciate your time. To our audience, if you’re interested in learning more about Keyhole Software, you can follow them on Twitter or head over to their website here.
Skeddly is the only all-in-one scheduling and automation service for your cloud. Only Skeddly can lower your cloud bills and manage your cloud backups in one place. Customers are happier knowing that Skeddly is working for them in the background.