
Christian Beedgen, Co-Founder & CTO

An Official Docker Image For The Sumo Logic Collector

12.11.2014 | Posted by Christian Beedgen, Co-Founder & CTO

Learning By Listening, And Doing

Over the last couple of months, we have spent a lot of time learning about Docker, the distributed application delivery platform that is taking the world by storm. We have started looking into how we can best leverage Docker for our own service. And of course, we have spent a lot of time talking to our customers. We have so far learned a lot by listening to them describe how they deal with logging in a containerized environment.

We have already re-blogged how Caleb, one of our customers, went about Adding Sumo Logic To A Dockerized App. Our very own Dwayne Hoover has written about Four Ways to Collect Docker Logs in Sumo Logic.

Along the way, it has become obvious that it makes sense for us to provide an “official” image for the Sumo Logic Collector. Sumo Logic exposes an easy-to-use HTTP API, but the vast majority of our customers leverage our Collector software as a trusted, production-grade data collection conduit. We are and will continue to be excited about folks building their own images for their own custom purposes. Yet the questions we get make it clear that we should release an official Sumo Logic Collector image for use in a containerized world.

Instant Gratification, With Batteries Included

A common way to integrate logging with containers is to use Syslog. This has been discussed before in various places all over the internet. If you can direct all your logs to Syslog, we now have a Sumo Logic Syslog Collector image that will get you up and running immediately:

docker run -d -p 514:514 -p 514:514/udp --name="sumo-logic-collector" sumologic/collector:latest-syslog [Access ID] [Access key]

Started this way, the container’s default Syslog port 514 is mapped to port 514 on the host. To test whether everything is working well, use telnet on the host:

telnet localhost 514

Then type some text, hit return, then CTRL-] to close the connection, and enter quit to exit telnet. After a few moments, what you typed should show up in the Sumo Logic service. Use a search to find the message(s).

To test the UDP listener, use Netcat on the host, along the lines of:

echo "I'm in ur sysloggz" | nc -v -u -w 0 localhost 514

And again, the message should show up on the Sumo Logic end when searched for.

If you want to start a container that is configured to log to syslog and make it automatically latch on to the Collector container’s exposed port, use linking:

docker run -it --link sumo-logic-collector:sumo ubuntu /bin/bash

From within the container, you can then talk to the Collector listening on port 514 by using the environment variables populated by the linking:

echo "I'm in ur linx" | nc -v -u -w 0 $SUMO_PORT_514_TCP_ADDR $SUMO_PORT_514_TCP_PORT

That’s all there is to it. The image is available from Docker Hub. Setting up an Access ID/Access Key combination is described in our online help.

Composing Collector Images From Our Base Image

Following the instructions above will get you going quickly, but of course it can’t possibly cover all the various logging scenarios that we need to support. To that end, we actually started by first creating a base image. The Syslog image extends this base image, and your future images can easily extend it as well. Let’s take a look at what is actually going on in the Github repo.

One of the main things we set out to solve was how to create an image that does not require customer credentials to be baked in. Having credentials in the image itself is obviously a bad idea! Putting them into the Dockerfile is even worse. The trick is to leverage a not-so-well-documented command line switch on the Collector executable to pass the Sumo Logic Access ID and Access Key combination to the Collector. Here’s the meat of the startup script referenced in the Dockerfile:

/opt/SumoCollector/collector console -- -t -i $access_id -k $access_key -n $collector_name -s $sources_json

The rest is really just grabbing the latest Collector Debian package, installing it on top of a base Ubuntu 14.04 system, invoking the start script, checking arguments, and so on.
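To make that concrete, the base image build boils down to something like the following Dockerfile. This is a simplified sketch, not the actual repo contents: the download URL and the script name are illustrative.

```dockerfile
# Simplified sketch of the base image build; URL and file names are illustrative
FROM ubuntu:14.04
MAINTAINER Sumo Logic

# Fetch and install the latest Collector Debian package
RUN apt-get update && apt-get install -y wget && \
    wget -q "https://collectors.sumologic.com/rest/download/deb/64" -O /tmp/collector.deb && \
    dpkg -i /tmp/collector.deb && \
    rm /tmp/collector.deb

# The startup script validates arguments and passes the credentials
# through to the Collector executable at container start
ADD run.sh /run.sh
ENTRYPOINT ["/run.sh"]
```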

As part of our continuous delivery pipeline, we are getting ready to update the Docker Hub-hosted image every time a new Collector is released. This will ensure that when you pull the image, the latest and greatest code is available.

How To Add The Batteries Yourself

The base image is intentionally kept very sparse and essentially ships with “batteries not included”. In itself, it will not lead to a working container. This is because the Sumo Logic Collector has a variety of ways to set up the actual log collection. It supports tailing files locally and remotely, as well as pulling Windows event logs locally and remotely.

Of course, it can also act as a Syslog sink. And it can do any of this in any combination at the same time. Therefore, the Collector is either configured manually via the Sumo Logic UI, or (and this is almost always the better way) via a configuration file. The configuration file, however, is something that will change from use case to use case and from customer to customer. Baking it into a generic image simply makes no sense.
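To give a flavor of what such a configuration file looks like, here is a hypothetical sources.json that sets up a Syslog UDP listener. The field names follow the Sumo Logic source configuration format, but treat the values as illustrative and check the examples in the repository for the authoritative versions:

```json
{
  "api.version": "v1",
  "sources": [
    {
      "sourceType": "Syslog",
      "name": "syslog-udp",
      "protocol": "UDP",
      "port": 514,
      "category": "syslog"
    }
  ]
}
```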

What we did instead is provide a set of examples. These can be found in the same Github repository under “example”. There are a couple of sumo-sources.json example files illustrating, respectively, how to set up file collection and how to set up Syslog UDP and Syslog TCP collection. The idea is to allow you to either take one of the example files verbatim, or use one as a starting point for your own sumo-sources.json. Then you can build a custom image using our image as a base image. To make this more concrete, create a new folder and put this Dockerfile in there:

FROM sumologic/collector
MAINTAINER Happy Sumo Customer
ADD sumo-sources.json /etc/sumo-sources.json

Next, put a sumo-sources.json groomed to fit your use case into the same folder. Then build the image and enjoy.

A Full Example

Using this approach, if you want to collect files from various containers, mount a directory on the host to the Sumo Logic Collector container. Then mount the same host directory to all the containers that use file logging. In each container, setup logging to log into a subdirectory of the mounted log directory. Finally, configure the Collector to just pull it all in.

The Sumo Logic Collector has been used in production across our customer base for years, pulling logs from files. More often than not, the Collector is pulling from a deep hierarchy of files on some NAS mount or equivalent. It is battle-tested at dealing with file-based collection.

Let’s say the logs directory on the host is called /tmp/clogs. Before setting up the source configuration accordingly, make a new directory for the files describing the image. Call it, for example, sumo-file. Into this directory, put this Dockerfile:

FROM sumologic/collector
MAINTAINER Happy Sumo Customer
ADD sumo-sources.json /etc/sumo-sources.json

The Dockerfile extends the base image, as discussed. Next to the Dockerfile, in the same directory, there needs to be a file called sumo-sources.json which contains the configuration:

{
  "api.version": "v1",
  "sources": [
    {
      "sourceType": "LocalFile",
      "name": "localfile-collector-container",
      "pathExpression": "/tmp/clogs/**",
      "multilineProcessingEnabled": false,
      "automaticDateParsing": true,
      "forceTimeZone": false,
      "category": "collector-container"
    }
  ]
}

With this in place, build the image, and run it:

docker run -d -v /tmp/clogs:/tmp/clogs --name="sumo-logic-collector" [image name] [your Access ID] [your Access key]

Finally, add -v /tmp/clogs:/tmp/clogs when running other containers that are configured to log to /tmp/clogs in order for the Collector to pick up the files.
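Putting it together, the pattern looks something like this, where my-app is a hypothetical application image that writes its log files under /tmp/clogs:

```shell
# Collector container watching the shared host directory
docker run -d -v /tmp/clogs:/tmp/clogs --name="sumo-logic-collector" [image name] [Access ID] [Access key]

# Application container logging into a subdirectory of the same shared directory
docker run -d -v /tmp/clogs:/tmp/clogs --name="my-app" my-app
```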

Just like the ready-to-go syslog image we described in the beginning, a canonical image for file collection is available. See the source:

docker run -v /tmp/clogs:/tmp/clogs -d --name="sumo-logic-collector" sumologic/collector:latest-file [Access ID] [Access key]

If you want to learn more about using JSON to configure sources to collect logs with the Sumo Logic Collector, there is a help page with all the options spelled out.

That’s all for today. We have more coming. Watch this space. And yes, comments are very welcome.


Shifting Into Overdrive

12.02.2014 | Posted by Christian Beedgen, Co-Founder & CTO


How Our Journey Began

Four years ago, my co-founder Kumar and I were just two guys who called coffee shops our office space. We had seen Werner Vogels’ AWS vision pitch at Stanford and imagined a world of Yottabyte scale where machine learning algorithms could make sense of it all. We dreamed of becoming the first and only native cloud analytics platform for machine generated data and next gen apps, and we dreamed that we would attract and empower customers. We imagined the day when we’d get our 500th customer. After years of troubleshooting scale limitations with on-premises enterprise software deployments, we bet our life savings that multi-tenant cloud apps could scale to the infinite data scales that were just a few years away.

Eclipsing Our First Goal

Just a few weeks ago, we added our 500th enterprise customer in just over two years since Sumo Logic’s inception. As software developers, the most gratifying part of our job is when customers use and love our software. This past month has been the most gratifying part of the journey so far as I’ve travelled around the world meeting with dozens of happy customers. At each city, I’m blown away by the impact that Sumo Logic has on our customers’ mission critical applications. Our code works, our customers love our software and our business is taking off faster than we could have imagined.  

Momentum Is Kicking In

Our gratitude for our customers only grows when we dig through the stats of what we’ve been able to build together with our world class investors and team of 170+ Sumos. Just last quarter alone, we exceeded expectations with:

  • 100%+ Quarter over Quarter ACV growth
  • 100+ new customer logos
  • 12 new 1 Terabyte/day customers
  • 1 Quadrillion new logs indexed

Dozens of new Sumos bringing badass skills from companies like Google, Atlassian, Microsoft, Akamai and even VMware…

Shifting Into Overdrive

It is still early days, and we have a long road of building ahead of us. Big data is approaching a $20B per year industry. And we’re addressing machine data, which is growing 5X faster than any other segment of data. No company has built a platform for machine data that approaches our scale in the cloud:

  • 1 million events ingested per second
  • 8 petabytes scanned per day
  • 1 million queries processed per day

Today, we’re excited to share the news that Ramin Sayar will be joining us to lead Sumo Logic as our new president and CEO. With 20 years of industry experience, he has a proven track record of remarkable leadership, incubating and growing significant new and emerging businesses within leading companies. He comes to us from VMware, where he was Sr. Vice President and General Manager of the Cloud Management Business Unit. In his time at VMware, he developed the product and business strategy and led the fastest growing business unit. He was responsible for the industry-leading Cloud Management business and strategy, R&D, operating P&L, product management, product marketing, and field/business operations for VMware’s Cloud Management offerings.

Our mission remains the same: to enable businesses to harness the power of machine data to improve their operations and deliver outstanding customer experience. With our current momentum and Ramin’s leadership, I am extremely excited about the next chapter in Sumo Logic’s journey. Please know how grateful we are to you, our customers, partners, and investors, for your belief in us and for the privilege to innovate on your behalf every day.


Meatballs And Flying Tacos Don’t Make a Cloud

10.02.2013 | Posted by Christian Beedgen, Co-Founder & CTO

Yes, we are cloud and proud. Puppies, ponies, rainbows, unicorns. We got them all. And this, too. But the cloud is not a personal choice for us at Sumo Logic. It is an imperative. An imperative to build a better product, for happier customers.

We strongly believe that if designed correctly, there is no need to fragment your product into many different pieces, each with different functional and performance characteristics that confuse decision-makers. We have built the Sumo Logic platform from the very beginning with a mindset of scalability. Sumo Logic is a service that is designed to appeal and adapt to many use cases. This explains why, in just three short years, we have been successful in a variety of enterprise accounts across three continents: first and foremost, our product scales.

On the surface, scale is all about the big numbers. We got Big Data, thank you. So do our customers, and we scale to the level required by enterprise customers. Yet scaling doesn’t just mean scaling up to ever-larger data sets. Scaling also means being able to scale back, to get out of the way, and to provide value to everyone, including those customers that might not have terabytes of data to deal with. Our Sumo Free offering has proven that our approach to scaling is holistic – one product for everyone. No hard decisions to be made now, and no hard decisions to be made later. Just do it and get value.

Another compelling advantage of our multi-tenant, one service approach is that we can very finely adjust to the amount of data and processing required by every customer, all the time. Elasticity is key, because it enables agility. Agile is the way of business today. Why would anyone want to get themselves tied into a fixed price license, and on top of that provision large amounts of compute and storage resources permanently upfront just to buy insurance for those days of the year when business spikes, or, God forbid, a black swan walks into the lobby? Sumo Logic is the cure for anti-agility in the machine data analytics space. As a customer, you get all the power you need, when you need it, without having to pay for it when you don’t.

Finally, Sumo Logic scales insight. With our recently announced anomaly detection capability, you can now rely on the army of squirrels housed in our infrastructure to generate and vet millions of hypotheses about potential problems on your behalf. Only the most highly correlated anomalies survive this rigorous process, meaning you get actionable insight into potential infrastructure issues for free.  You will notice repetitive events and be able to annotate them precisely and improve your operational processes. Even better – you will be able to share documented anomalous events with and consume them back from the Sumo Logic community. What scales to six billion humans? Sumo Logic does.

One more thing: as a cloud-native company, we have also scaled the product development process, to release more features, more improvements, and yes, more bug fixes than any incumbent vendor. Sumo Logic runs at the time of now, and new stuff rolls out on a weekly basis. Tired of waiting for a year to get issues addressed? Tired of then having to provision an IT project to just update the monitoring infrastructure? Scared of how that same issue will apply even if the vendor “hosts” the software for you? We can help.

Sumo Logic scales, along all dimensions. You like scale? Come on over.

Oh, and thanks for the date, Praveen. I’ll let you take the check.


Me at the End of the World

12.26.2012 | Posted by Christian Beedgen, Co-Founder & CTO

December 20th, 2012. San Francisco Airport. I am sitting at gate 101, waiting to board a plane to Frankfurt, as I do every year around this time. Today I will be flying on an Airbus 380, for the first time in my life. When I arrive in Frankfurt, it will already be the 21st. So, maybe I will never make it there, and the world has ended indeed. Or, maybe, being in the air while everything is going to hell is actually a smart idea, and maybe I will be one of the survivors. As you can tell, it is this time of the year – the liminal space between the years, giving room to much thought about the future, and the past. Fittingly, I am thinking about all the things that have happened last year to our little company. And it is good things that have happened, so I will face the oncoming end of the world with a defiant smirk, anticipating that 2013 will be an even more fun and engaging year, whichever world we will be living in then.

Kumar and I conceived Sumo Logic in early 2010. We started raising money, and eventually the company became a reality in May 2010. We started accepting beta customers around the same time in 2011. And in January 2012, two years after inception, we publicly launched the Sumo Logic service, and declared the company open for business. Throughout this past year however, we have seen what was just a twinkle in our eyes become a reality – a product that is solving problems for our users to such an extent that they are happily becoming customers. And customers are the most important thing for us – we live by and for our customers. The greatest satisfaction when building a product is to see it used. Ultimately we are a business, and paying customers enable us to continue our work on making them and us successful.

Then we introduced Sumo Logic Free in June 2012. With Sumo Logic Free, the entire Sumo Logic service is available to anyone for free. There are no limitations other than the amount of data that you can send every day. I think this marks an important step and is actually part of a bigger shift in enterprise software. Instead of hiding the actual product behind endless layers of sales process, today we proudly make the entire, un-crippled product available without any strings attached. Sumo Logic users are becoming customers knowing exactly what they are going to get. We can’t and don’t want to hide what our service is capable of. The age of Software-as-a-Service has changed a lot of things, and certainly so in the realm of enterprise software.

2012 also marks another important milestone for Sumo Logic. When we got started, we were 100% focused on building the product. Now that we are officially in the market, the non-product functions of the company are becoming more and more important. This is obviously something every technology startup is going through. We are blessed and happy to have had the chance to bring Vance Loiselle on board as our President and CEO. Not only does Vance have a stellar track record with a string of successful companies, he’s also actually kinda cool, in his own nerdy way :) – Around the same time as Vance, Mark Musselman joined us to run Sales. I am telling you, I have seen some very sad attempts at selling products in early stage companies, and Mark and all the folks he has since brought on board definitely are on a totally different level, in a class of their own, really. Reflecting on the before-and-after of our sales process over this year, I can only say that I am extremely happy, and super excited about what the next year will bring. What unifies the business and product organizations in Sumo Logic is the unadulterated belief in the superiority of our approach, and the depth of our vision. And no-one is better suited to talk about exactly this vision than Sanjay Sarathy, who some 60 days ago (he denies it, but he is counting) started as our VP of Marketing.

Finally, we ended the year with a bang, announcing the successful close of a Series C round of financing. Getting the folks from Accel to be excited about our company, and to lead the round is a great validation for what we are trying to do here at Sumo Logic. Ping and Jake have enormous and deep knowledge of the Big Data landscape, and no shortage of experience in the operational intelligence space as well. We are honored and very happy to have them on board the Starship Sumo.

And I am still sitting at the gate. My plane apparently got injured by “turbulence” on the way here. The “guys” are inspecting it, and me and like 600 more people waiting to get out of here will be “told” in an hour as to what happens next. Life is like that. Exciting, and not always a straight line. That was Sumo Logic in 2012 and we will be harderbetterfasterstronger in 2013.

Follow me at and follow Sumo Logic at

Post Scriptum: that plane eventually got cancelled, leading to massive chaos at the airport. I managed to get rebooked for the following day, via Calgary to Frankfurt. Of course, the next day comes with bad weather and tremendous delays in San Francisco. I do go to Calgary, but miss the connection due to all the delays. I am back at SFO only hours later. I finally manage to fly to Houston on standby the next morning, and take a flight to Frankfurt from there in the evening. So I do make it to Germany. My personal end of the world kicked in around 2am on Friday the 22nd, when I was sitting in the departure hall at SFO waiting to check in my bag for the 3rd time, and realizing I’d have to wait until 4:30 in the morning, because check-ins are closed. No sleep til Frankfurt, fellas!


The Precursor Legacy

04.24.2012 | Posted by Christian Beedgen, Co-Founder & CTO

This past week has seen the long-awaited Splunk IPO turn into a reality. After nearly 10 years together at ArcSight, Kumar and I were along for the ride in 2008 when ArcSight went public. We know on a very deep level how hard it is for any company to reach this milestone. Our hats are off to Splunk for their precision in positioning and timing. The resulting positive reaction of the market is more than well deserved. Splunk is now the second public company that has bet the house on logs and unstructured data, and it clearly has managed to do something that ArcSight didn’t: to convince the world that logs are a powerful way to manage not just security, but also IT operations, and applications in general. After all, business has had its share of analytics tools. It’s time for IT to catch up — and we are now seeing this space having reached mainstream momentum and attention.

Another Song to Sing

As part of the press frenzy last week, a number of people have started to look into what’s next in this space. Big Data has many angles, and we firmly believe that logs and unstructured data are a huge part of it. Reuters published an overview along those lines. We also happened to have met with Jonah Kowall from Gartner last week. His thoughts can be found here. Both articles touch on our firmly held belief that evolution cannot and will not stop, and that in fact some of the biggest contributors to application, IT and security management problems contain the keys to tame and solve them.

Big River

It has long been established that the rate at which data is being produced is growing exponentially, and that almost all of that data is basically unstructured. Mapping this back to IT, it is clear that there will never be another unified and standardized set of protocols upon which to build the one and only management and analytics tool to rule them all. With the proliferation of deployment models in today’s highly heterogeneous environments, IT has to adapt to business needs in real-time. To accomplish this, the best and most detailed inputs are the operational logs generated in real-time by the IT infrastructure.

If I Had a Hammer

The key for the next generation of IT analytics products is to understand that any and all data must be considered as grist for the analytics mill. Relying on having to know the semantics of the data by requiring a pre-fabricated parser in order to use the data translates to keeping the door shut for some of the most detailed data. Going up the stack to the application layer, this is even more true. In order to provide more than just troubleshooting capabilities, even data that has never before been seen needs to be an input into the analytics engine. Meaningful aggregation and comprehension can be based on automatically inferring structure, and large-scale refereed structure inference will in turn lead to better semantic understanding of the data.

(There’ll be) Peace in the Valley

Ultimately, the power of any analytics is based on how much we know about the meaning of the data. Otherwise, the data is just that – data. Analytics turn data into information, and ultimately insight. We believe that the best way to accomplish this is by offering application, IT, and security management and analytics as a cloud-based service that can use the power of all the data to constantly improve analytics. Enterprises should embrace Big Data, and ask for analytics as a service, rather than trying to locally reinvent the wheel over and over again.


It’s a culture thing – Devopsdays Austin 2012

04.10.2012 | Posted by Christian Beedgen, Co-Founder & CTO

Stefan and I attended Devopsdays last week in Austin. It was a great event, and I am really glad we went — it’s always fun to be able to present your company to the public. We are very comfortable with the development and operations crowd, because it is largely at the core of what we are doing ourselves. There’s not a whole lot of abstractions to overcome! Sumo Logic sponsored the event, and so we had a little table set up in the “vendor” area. There, as well as throughout the conference, we had many interesting discussions, about our product, but also about the larger theme of the conference.

We gave away a lot of T-Shirts, and it turns out that the little Sumo toys we had initially made for the company birthday two weeks ago are a great giveaway. This is the first time we came equipped with swag, and it came across well. As topical as Log Analytics and Application Management are for the crowd attending, it’s still fun to see them all smile at little toys of big naked men!

Maybe my single most favorite moment of the entire conference was when the discussion turned to hiring. We are still struggling with a recovering economy and uncomfortably high unemployment numbers in this country, so it was notable that when the room was asked who’s hiring, pretty much all hands went up. Wow. Then somebody yelled out, “Hey, who needs a job?” And all hands went down. Not a single person in the room was looking for a job. In the words of @wickett on Twitter: “No recession in DevOps world”.

One of the things I personally find fascinating is to observe the formation of trends, communities, maybe even cultures. It is not often that one has the luck to be around when something new is getting born. I was personally lucky to be, albeit somewhat from afar, observing the early days of the Ruby On Rails community, having attended the first conference in Chicago (and then some more in the following years). Rails never really mattered in my day job, and I ultimately was just a bystander. But even so, seeing the thought process in the community evolve was extremely interesting. I feel a little bit similar about the Devops development (pun!!). I actually was attending that mythical gathering in Mountain View in 2010. But at the time, I was more worried about getting Sumo Logic off the ground, so I actually didn’t pay attention :)

I was trying to listen in a bit more closely this time. A good overall summary of where Devops has come from — and what its main motivational forces are today — is available in a recent post by John Willis. John also presented the keynote kicking off the Austin event. This was a very interesting talk, as it was laying out the basic principles behind Devops as seen through the eyes of one of the main players in the movement.

Based on the keynote, here’s Devops in 5 keywords (buzzwords?): Culture – Lean – Automation – Measurement – Sharing. In that order. This leads to the following insight: Devops is a human problem — it’s a problem of culture, and it’s the cultural aspects that need to be addressed first, before even thinking about the other four principles. In other words, as great as tools such as Puppet, Chef, Github, and yes, Sumo Logic are, they can’t in themselves change a culture that is based on segregation. Or, simply put: as long as you have (process and cultural) walls between development and operations, operations and security, and security and development, you end up with people that say No. And that’s basically the end of agility.

And this leads to something that surprised me (I guess I am a bit late to the party, but hey): I am sensing that Devops is really about the desire on the side of the operations folks to apply the learnings of Agile Development. I consider this as a good thing. We are building more and more software that runs as a service, and so it’s pretty obvious that Agile needs to extend from the construction process into the deployment process (and along the way destroy the distinction). I do think that the Agile approach has won in the development world. It still needs to be applied properly however (see for example “Flaccid Scrum”), and I am sure overeager managers will cause more than one spectacular failure for Devops projects by misunderstanding the process/tools vs culture priorities. And since we are in 2012, Agile rears its head in one of its newer incarnations in this context: Lean – see above, right after Culture. Given that the name “Devops” is still hotly discussed, maybe we will end up with a new label before too long: LeanOps, anyone?

It was also great to see teams within larger companies making the leap – the best example is National Instruments (also the host of the event), who have managed to get more agile by adopting a Devops approach (see also this presentation). So in summary, this event was great fun. A lot of real people with real problems, applying real forward thinking. I felt the crowd was leaning more towards Ops vs Dev, but as I said above, at least in the context of the systems we are building here at Sumo Logic, this distinction has long been jettisoned.

And of course, people need tools. In all our discussions, the ability to manage and analyze the logs of production systems has stood out as a key contributor in allowing teams to troubleshoot and find the root causes of issues in the applications faster, and to manage their applications and infrastructure more proactively such that they can find and fix problems before they impact the customer.

Finally, in an act of shameless self promotion, here’s yours truly being interviewed by Barton George from Dell during the event.


Sumo Logic turns 2

03.29.2012 | Posted by Christian Beedgen, Co-Founder & CTO

Today, we find ourselves in the exceptionally fortunate situation of being able to celebrate the second birthday of Sumo Logic. Companies obviously don’t get created on a single day, but from the beginning, Kumar and I always thought that March 29th of 2010 was the real beginning of the life of this company. On this day two years ago, we agreed with Asheem Chandna on the terms under which Greylock Partners would invest into our vision as part of a Series A. This really is the singular point in time at which Sumo Logic became Sumo Logic. Well, technically, it was another couple of days, as we raced to actually incorporate the company as part of the closing of the financing :) – And yes, we did get the termsheet at the Starbucks on Sand Hill Road, nervously sipping drip. Life really can work that way.



Log Data is Big Data

01.20.2012 | Posted by Christian Beedgen, Co-Founder & CTO

Nearly all of today’s most successful businesses rely on data to make smart decisions. Information technology provides the business with the platform for processing this data. It should stand to reason, then, that IT should be making decisions based on data just the same, in order to optimize and secure the data processing infrastructure. Welcome to the tautology club.

The single biggest data set that IT can use for monitoring, planning, and optimization is log data. After all, logs are what the IT infrastructure generates while it is going about its business (pun intended). Log data is generally the most detailed data available for analyzing the state of the business systems, whether it be for operations, application management, or security. Best of all, the log data is being generated whether it is being collected or not. It’s free data, really. But in order to use it, some non-trivial additional infrastructure has to be put in place. And with that still, first generation log management tools did run into problems scaling to the required amount of data, even before the data explosion we have seen over the last couple of years really took off.



Log Management Challenges: So Much Pain, Not Enough Gain

11.29.2011 | Posted by Christian Beedgen, Co-Founder & CTO

True fact: unstructured data not only represents the average enterprise’s largest data set, but it’s also growing at a mind-boggling rate, which presents significant problems. Unstructured data, almost by definition, is not readily available to be analyzed.

Log management addresses a significant subset of this expanding pile of unstructured data: diagnostic and run-time log information produced by applications, servers, and devices. Think of these logs like IT’s exhaust. Since these data sets are massive and unwieldy, organizations often opt to avoid them altogether; those who do use them are typically forced to implement and support a costly legacy log management solution.

… Continue Reading