Content Delivery Network

4 Simple Ways To Optimise Your Site

If you’ve invested time and money in creating an appealing website, the next step is to ensure that it gets enough traffic and visibility on search engine results pages (SERPs).

How high up you are in SERP rankings through organic search results can be greatly affected by the SEO methods you use.

Good SEO practices help search engines to find and index your site, while also giving visitors a good user experience. This in turn will increase the likelihood of repeat visits and ultimately improve your traffic. More traffic means more conversions and more sales. In other words, by helping your online presence, good SEO will increase the success of your business.

Here are some simple ways to optimise your site and get the traffic you need to make your business a success.

Quality Content

Blogs, articles, videos and other forms of content should be informative, relevant and engaging.

Don’t mistake quantity for quality. Your content should be high quality and something that visitors want to read or watch. Knowing your audience and connecting with them through your content is one of the most fundamental SEO rules.

Use keywords strategically and sparsely. Don’t overstuff your content with keywords, especially phrases that don’t fit naturally into the text.

Google is moving more towards long-form, high quality content that offers valuable information to visitors rather than mass produced, shorter pieces written for robots.

User Experience

Your website should focus on your visitors, not just your business. By creating a pleasant user experience, you’ll guarantee return visits. Your site should be easy to navigate, with relevant content, images and internal links.

Ensure your website is mobile friendly and easy to use on all devices with simple, clear CTA buttons and good loading times.

If your website is built for e-commerce, ensure the checkout is simple and seamless and that products can be found easily.

Speed

In an increasingly impatient world, speed is an essential element for optimising your site. Pages should load quickly and videos should take seconds, even milliseconds, to download.

Don’t keep your visitors waiting; it’s a surefire way to lose them.


Investing in a good Content Delivery Network can give your site the speed, reliability and security it needs to create an optimal user experience and improve your traffic.

A good CDN ensures your site is visible and accessible from anywhere in the world, reaching audiences in all corners of the globe. It delivers a good user experience no matter how many visitors are using your site concurrently.

For more information, contact SynEdge today.

CDN Software


The Advantages Of CDN

The internet age gives business owners the opportunity to reach out on a worldwide scale. Never before have businesses been so accessible. This also means that competition is higher. In the battle for higher rankings, optimal user experience has become the goal for all companies with an online presence.

The key is speed. The online world is an impatient one and that’s why a Content Delivery Network is essential for those who want to enhance their web performance and provide an excellent user experience.

Here are just some of the advantages of CDN software for your business.

Reach Further And Quicker

Bandwidth determines how fast data is transferred; latency determines how long it takes to get there. A CDN can solve latency issues, cutting down on delays so that even long-distance transactions work quickly and seamlessly, leaving you with happy customers, wherever they are in the world.

Reliable Load Times

Speed and latency can mean the difference between making a sale and losing customers. If your pages take more than a few seconds to load or visitors have to wait for downloads, they will move on, no matter how good your products and services are. What’s the point in spending precious time and money on a fantastic looking website, only to be thwarted by reliability issues? CDN means speed, and speed means customer satisfaction and healthy sales.


CDN software is not only used for speed and performance. It can also be used as an analytical tool to obtain valuable information. Monitoring and analysing trends and customer usage behaviour allows you to take advantage of opportunities that could ultimately help increase your sales.


CDN software gives you a single platform that works across the globe, cutting out the need for multiple server providers and the complex infrastructures of expensive foreign hosting. This can save you a great deal of time and money.


With a CDN you have the added assurance that your content is protected and downloads are reliable and secure.

If you’re serious about promoting your business through your website, then reliability and speed are essential. Investing in a CDN will not only save you time and money, but will help provide users with a satisfying and pleasurable online experience, boosting sales and your company’s credibility. In this competitive age, it will keep you one step ahead of your rivals.

Make your world bigger – contact SynEdge today for more information on CDN software delivery, or visit us online to see our full range of CDN solutions.

Media Delivery CDN


Why Your Website Can Benefit From A Custom Video

Content marketing is essential for the success of your website and, therefore, your business. However, it’s not just about what you present, but how you present it.

Written content will always have the top place on most websites, but you should never underestimate the power of a custom video. In recent times custom videos have been gaining popularity as an alternative way of presenting content. Working alongside the written word, it’s an effective way of getting your message across and showcasing your business.

Here’s a look at some of the benefits of creating a custom video for your website:

Immediate Impact

Internet users are an impatient lot. They want information that’s quick and easy to digest. A video gets the message across immediately. It’s a great way of grabbing attention. Don’t disregard written content though. Once you’ve created an interest through your video, many visitors will be happy to read on for more information.

A More Personal Approach

A video adds a human aspect and helps you connect ‘face-to-face’ with your visitors. Sound and motion appeal more to the senses, and seeing a real person helps build up trust in your product and your business. A video is a great way to promote your brand. It also uses a ‘show, don’t tell’ way of marketing your products: you can actually demonstrate how to use certain products rather than just describing them.

Google Loves Videos

A custom video can help create higher conversion rates and improve your SEO. ‘Dwell time’, the length of time a visitor spends on your site, is a factor in Google rankings. It’s a combination of session duration, bounce rate and click-throughs. An engaging video encourages visitors to stick around, lowering your bounce rate and increasing your click-throughs from the SERPs.

A Step Ahead

Consumers are more likely to buy a product with the help of a video, and real estate sites with videos receive far more enquiries than those without. Yet many business websites still don’t use custom videos. It’s a great opportunity for you to use an innovative and creative marketing tool to keep one step ahead of the competition.


A media delivery CDN ensures your custom videos have super-fast download times, perform well, and are reliable, secure and mobile friendly. Analytics engineers can also monitor visitor behaviour so you can maximise opportunities for increased revenue.

For more information on our media delivery CDN solutions, as well as our private CDN solutions, contact SynEdge today, or visit us online to view our full range of CDN solutions.

CDN Solutions

The SynEdge CDN Solutions

Using a Content Delivery Network (CDN) is crucial to having a global business – if your website is reaching people across the world then you need to ensure that they’re able to access your content reliably and with low latency.

On the internet, latency is the time it takes for information to travel from the server to the user. A CDN’s primary purpose is to decrease latency, keeping your end users happy.

Our CDN solutions are designed to help businesses meet specific needs. We offer solutions for gaming, e-Commerce, software and media delivery, as well as private CDNs.

Gaming Solutions

Our gaming solutions offer a range of features for game delivery, developed to address the specific challenges of distributing online games to players all over the world. Gaming is highly competitive, and if your users aren’t able to access your games or download them quickly, they will go elsewhere.

Ecommerce Solutions

Our e-Commerce solutions are centred around the following:

  • Full, secure transactions for your user base
  • Geo-control to ensure that online products are available to relevant audiences
  • Advanced audience analytics that offer you the opportunity to grow with transaction trends
  • Improved discoverability that will benefit your SEO strategy

e-Commerce sites tend to host a lot of static content (images), which makes web pages ‘heavy’ and affects loading times, especially when the information is being retrieved from the other side of the world.

Software Delivery Solutions

It’s important that your large downloads and software patches are delivered efficiently. For this reason, we offer software delivery solutions with the lowest error rate and highest throughput in the industry. All of our technology is cutting edge and at the forefront of the industry, resulting in fewer moving parts. This also reduces the footprint produced, especially in comparison to older, legacy CDNs.

Media Delivery

Customers have specific media demands, and media delivery CDN solutions help you meet them. The most important of these is streaming video: customers want the fast video streaming that our media delivery CDN solutions provide. Abandonment rates for slow-loading video are very high because consumers expect near-instant loading times.

Private CDNs

Sometimes, commercial CDN solutions aren’t enough to get the job done. In this situation, a private CDN solution is needed for greater control over your delivery. Our private CDN solutions give you dedicated PoPs in locations of your choosing that can also be white labelled as your own.

SynEdge has been designed with you in mind; the team has the experience and knowledge needed to help you with CDNs. Our platform offers a transparent, intelligent, and cost effective way to deliver your content. For more information on our CDN solutions, contact SynEdge today, or alternatively, visit us online for more help. We’ll gladly be of assistance.

Turning Data into Information: Part Five

This is a series on the technical background of getting information out of the CDN. Part five is about what is to come: the information we will add and the challenges this will pose for us. A glimpse into the future.

Currently we only support data which can produce graphs. This is great for a lot of metrics but some information is better presented in tables. One of the planned improvements is to extend the query language to also fetch tabular data.

Another new feature we are looking into is error codes. We currently offer error codes grouped by the 2xx, 4xx and 5xx ranges, and we would like to allow users to see the individual error codes as well.
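To illustrate the two views, grouping individual status codes into the current 2xx/4xx/5xx buckets while keeping per-code counts takes only a few lines; the sample codes below are made up:

```python
from collections import Counter

def group_status_codes(codes):
    """Count status codes both individually and by range (2xx/4xx/5xx),
    so the grouped view offered today and a per-code view can coexist."""
    by_code = Counter(codes)
    by_range = Counter(f"{code // 100}xx" for code in codes)
    return by_range, by_code

by_range, by_code = group_status_codes([200, 200, 404, 404, 500, 200])
print(by_range["2xx"], by_range["4xx"], by_code[404])  # 3 2 2
```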

Our log files contain a lot more valuable information, like the location of the user requesting the file. While most of our data so far has relatively few possible values, like data centre or protocol, user location adds hundreds. This can make querying and displaying the data a challenge.

Another limitation is that all our data is currently viewed per distribution group, protocol, region and datacenter. We want to add new views to this, for example error codes per user location, or cache hits versus a specific error code.

With all these changes it is likely we’ll stop using Graphite as we do today. It would need to create too many files, which would take up a lot of space and duplicate a lot of data. Storing tabular data in Graphite is also not possible.

This gives us a dilemma. We want to make as much data as possible available to our customers, but at the same time we do not want to change our API, as this would mean our current users would have to invest time in changing their code. Our goal is to make all these changes without breaking the current interface. Having a custom query language will help us with this.

It’s going to be an interesting time for our development team as we start to work on this.

I hope you liked this short series with a bit of technical background. If you want to know more about it, feel free to leave a comment, or visit our contact page and get in touch!

Paul van Assen, Head of Development.

Turning Data into Information: Part Four

This is a series on the technical background of getting information out of the CDN.

Part four of the series is all about storage and visualisation. How can we store the vast amounts of data in an efficient way and how are we presenting it to the user?

In our previous blog about statistics we mentioned we’re using Graphite as our metrics storage platform. Graphite is made up of three components: Carbon, which receives statistics, caches them and writes them to the database; Whisper, the actual data storage engine; and the Graphite web application, which retrieves data from Whisper and renders graphs or outputs the numbers.

The Whisper storage engine stores data per metric. We can access this data through the Graphite web application by specifying one or more metrics. We can also apply simple calculations on the data before it is returned. This provides the basis for our dashboard.

In order to prevent vendor lock-in, where we would be bound to one specific implementation of fetching and presenting data, we built a small layer between our API and Graphite. This layer consists of a modified way of sending data to the dashboard and a query language. One such difference is that Graphite stores and sends data in seconds, whereas our whole API works in milliseconds.

The query language we are using to fetch data is heavily based on Graphite’s own query language. If you know how to pull data from Graphite, it’s very easy to fetch data through our own API. We translate the query received from the API into an object tree. From this object tree we can construct Graphite queries, but also MySQL queries. When we receive the data we can do post processing on the numbers to support features not included in Graphite, like weighted average over multiple targets.
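As a rough illustration of that post-processing step, a weighted average over multiple targets, including the seconds-to-milliseconds conversion mentioned above, might look like this; the data shape and metric names are assumptions for the sketch, not our actual internal format:

```python
def weighted_average(targets):
    """Combine several Graphite-style series into one weighted average.
    Each series is a list of (timestamp_s, value, weight) tuples; the
    result is keyed by millisecond timestamps, as the API works in ms."""
    totals = {}
    for series in targets.values():
        for ts, value, weight in series:
            acc, w = totals.get(ts, (0.0, 0.0))
            totals[ts] = (acc + value * weight, w + weight)
    return {ts * 1000: acc / w for ts, (acc, w) in totals.items() if w}

targets = {
    "pop.ams.response_ms": [(1000, 10.0, 3)],  # 3 requests averaging 10 ms
    "pop.lon.response_ms": [(1000, 20.0, 1)],  # 1 request at 20 ms
}
print(weighted_average(targets))  # {1000000: 12.5}
```

Weighting by request count keeps a busy PoP from being drowned out by a quiet one, which a plain average of the two series would do.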

All this comes together in our dashboard. On our homepage you will be greeted with the most important statistics of the current month. If you want to dive in deeper, you can go to the analysis tab. There you can build your own query: grouping and summing metrics, applying transformations over time like averages per hour, and selecting more metrics than are visible on the homepage.

If all this is not enough, you can also use the API to send your queries straight to our servers. They will return the raw data ready to use in your own systems. It’s our goal to make as much data as possible available to our customers. We are constantly working on improving the systems, making it easier to access data and making it possible to add new metrics in the future. A sneak peek into the future will be the final post in this series.

Turning Data into Information: Part Three

This is a series on the technical background of getting information out of the CDN. Part three is a technical article about how we convert data which is distributed across multiple locations into information.

In our previous blog post, we explained how we fetch all log lines and feed them into Kafka. In all of our current PoPs we are running Apache Spark. Spark was initially designed to do batch processing on large chunks of data. Later on, Spark Streaming was added.

With Spark Streaming it is possible to do batch operations via streaming. In our setup, Spark Streaming buffers all data from the Kafka message queue for 15 seconds. Once it has 15 seconds worth of data, it initiates the processing of this batch. By buffering for 15 seconds we benefit from having very little overhead of starting a task while still having metrics available near-real time.
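The windowing idea can be mimicked in a few lines of plain Python; this is a simulation of assigning events to 15-second batches, not Spark code:

```python
def micro_batches(events, window=15):
    """Assign (timestamp, payload) events to fixed 15-second windows,
    mimicking how Spark Streaming buffers Kafka messages per batch."""
    batches = {}
    for ts, payload in events:
        # Every event falls in the window starting at the nearest
        # lower multiple of `window` seconds.
        batches.setdefault(ts - ts % window, []).append(payload)
    return batches

events = [(0, "a"), (7, "b"), (14, "c"), (15, "d"), (31, "e")]
print(micro_batches(events))  # {0: ['a', 'b', 'c'], 15: ['d'], 30: ['e']}
```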

The way the metrics are set up, there is no dependency between data from different PoPs. This means we can calculate the metrics for one PoP without having to wait for calculations done somewhere else. Also because of this we can process the same log line in two different locations, which should give us the same result. This allows us to achieve redundancy of our calculations by sending each log line to two different PoPs.

In a perfect world, all log lines would flow into Spark in a predefined order, from old to new. No log line would ever enter the queue out of order, and we would only have to do the calculations for the past 5 minutes. In the real world, things are quite different. That’s why we store intermediate results of all our calculations for at least two days. This allows us to get the correct data even if a log line arrives way too late.

This however is an exceptional case. During normal operations 99% of the log lines arrive within 15 minutes of hitting an edge node, 95% within 1 minute. This means we can provide a very good view of what is happening on the CDN in near real time.

Currently all statistics generated are in the form of a number at a specific timestamp. In order to store this data effectively we are using Graphite. This tool specialises in storing just timestamp-plus-value data. It also provides a way to add up multiple metrics or to calculate averages over time.
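Feeding such a number into Graphite is as simple as writing one line per metric in Carbon's plaintext protocol, 'path value timestamp', sent to Carbon's plaintext port (2003 by default); the metric path below is invented for the example:

```python
import time

def carbon_line(path, value, timestamp=None):
    """Format one metric in Carbon's plaintext protocol:
    '<metric.path> <value> <unix_timestamp>' plus a newline."""
    ts = int(time.time() if timestamp is None else timestamp)
    return f"{path} {value} {ts}\n"

# One request counter for a hypothetical Amsterdam PoP:
print(carbon_line("cdn.pop.ams.requests", 1234, 1500000000), end="")
# cdn.pop.ams.requests 1234 1500000000
```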

All metrics you see on our dashboard are from our Graphite backend. From our analytics page you can dive into the numbers to get out information that is relevant for your business.

Turning Data into Information: Part Two

This post is the second in our series on the technical background of getting information out of the CDN.

Part two provides a technical background on how we gather log entries from different sources, how we convert them into a generic format and feed them into a distributed system.

For reporting, our smallest unit is a datacenter, or point of presence (PoP). In our current generation of PoPs, we are processing the log lines inside the PoP. Because a PoP is a collection of multiple edge nodes, we need to gather all of the log entries. For our processing the smallest window is 15 seconds. We collect all log entries from all edge nodes and buffer them for 15 seconds before processing them.

On the edge nodes we receive the log lines from our Nginx webservers and, in a non-blocking way, feed these lines to a local Kafka cluster. Apache Kafka is a distributed message queue. To us, every log line is a message that we need to act on. The Kafka cluster makes sure messages are stored while awaiting processing and that processing happens only once.

In order to achieve redundancy, we also send a copy of the log line to one of our other PoPs. Which PoP it goes to is fixed beforehand.

Our legacy PoPs, on the other hand, do not have the capacity to store and process all log lines locally. We gather all log lines from our legacy PoPs in Amsterdam, via FTP or HTTP depending on the PoP. Our legacy PoPs generate a slightly different logging format. To keep data processing simple, and the raw logs the same for everyone, we transform the log lines from the legacy format into the current format upon download.
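As an illustration of such a transformation, suppose the legacy format were pipe-separated and the current one space-separated; both layouts here are invented for the sketch, not our real formats:

```python
def legacy_to_current(line):
    """Rewrite one legacy log line into the current format on download.
    Hypothetical layouts:
      legacy : 'epoch|status|bytes|path'
      current: 'epoch status bytes path duration' (duration unknown: '-')"""
    epoch, status, size, path = line.strip().split("|")
    return f"{epoch} {status} {size} {path} -"

print(legacy_to_current("1500000000|200|52341|/assets/app.js"))
# 1500000000 200 52341 /assets/app.js -
```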

Log lines from our legacy PoPs are then sent into two Kafka clusters, in two different PoPs. Here too, it is predetermined which PoPs the log lines are sent to.

All these log lines add up to a lot of data. In the current setup it adds up to twice as much, since every log line is sent to two PoPs. To save bandwidth and disk space, but also IO operations, we sacrifice some CPU power to apply Gzip compression. Even with minimal compression we are able to cut the size by around 60%.
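The effect is easy to reproduce with the standard library: on repetitive data like access logs, even gzip's fastest setting routinely saves well over half the bytes. The log format below is made up for the demonstration:

```python
import gzip

# A chunk of highly similar log lines, as access logs tend to be.
raw = "\n".join(
    f"1500000{i:03d} GET /assets/app.js 200 52341 0.012"
    for i in range(1000)
).encode()

packed = gzip.compress(raw, compresslevel=1)  # minimal compression effort
saving = 1 - len(packed) / len(raw)
print(f"{len(raw)} -> {len(packed)} bytes, saved {saving:.0%}")
```

The exact ratio depends on the data, but the trade is the one described above: a little CPU for a large cut in bandwidth, disk space and IO.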

If we kept all the log lines around, even compressed, we’d run out of hard disk space quite fast. Kafka, the distributed message queue, keeps track of which log lines have been processed and which haven’t. Once a log line has been processed it is marked for deletion. This way Kafka keeps the disk space used as small as possible.

With this setup we have a steady flow of log lines flowing into our data processing cluster. This cluster does the heavy lifting of calculating the metrics for each of our PoPs, turning data into information.

Turning Data into Information: Part One

This post is part one in a series on the technical background of gaining information from our CDN written by our Head of Development. Part one is an introduction to the global infrastructure required to turn data into information.

For anyone using a CDN, performance is key. To keep the performance optimal, our technical team needs insight into what is happening on the CDN. A major part of this insight is provided through the data generated by people requesting files.

When an end user retrieves a file from our CDN, the browser fetches the file from one of our servers, called edge nodes. An edge node will serve the file to the browser, either from cache or from the origin.

Every time a browser requests a file, a log line is generated. This log line contains several key metrics for measuring performance: the size of the file requested, how long it took to transfer the file, how much data was actually transferred and where the user came from. With these metrics we can optimise our network to send end users to the fastest edge node. We can also advise the builder of the website to optimise their requests, reducing overhead.
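A per-request log line like the one described might be picked apart as follows; the field layout here is hypothetical, and real edge-node logs will differ:

```python
def parse_log_line(line):
    """Extract the key performance metrics from one log line.
    Assumed layout: 'epoch client_ip path size_bytes sent_bytes duration_s'."""
    epoch, client, path, size, sent, duration = line.split()
    return {
        "timestamp": int(epoch),
        "client": client,                       # where the user came from
        "path": path,
        "size_bytes": int(size),                # size of the file requested
        "sent_bytes": int(sent),                # data actually transferred
        "duration_ms": float(duration) * 1000,  # time to transfer the file
    }

rec = parse_log_line("1500000000 203.0.113.9 /video.mp4 1048576 524288 0.25")
print(rec["duration_ms"], rec["sent_bytes"])  # 250.0 524288
```

Comparing `sent_bytes` with `size_bytes` is what reveals aborted or partial transfers, one of the signals used to steer users to the fastest edge node.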

When a log line is generated we process this into usable information. For most information, speed matters. The sooner you have access to information, the more valuable the information is. We want to have all information coming from the edge nodes available as soon as possible so we can act on any issues before anyone else can spot them.

With a constant flow of requests hitting our edge nodes it can be quite a challenge converting all this data into information, especially if you want to get the information as fast as possible.

The traditional way of data processing is what is known as batch processing. You wait for a while, gathering all the data, in our case log lines. When everything has been gathered, the processing phase begins. Once processing is done, the end result is ready. This way of processing often happens once per hour or even once a day. In the case of a CDN, this delay would take away most of the value the information has.

We have chosen an approach where we continuously feed the data into our cluster, which processes it almost as it arrives. This generates a continuous, near real-time flow of information. This information is presented in the form of line charts.

In our Eindhoven office we continuously display some of these line charts to monitor the system. We also provide all the charts to our customers. They can be found in the dashboard, under the Analytics tab. In there you can build your own charts, or use our API to fetch the data straight from our servers.

The Benefits of Using a Private Content Delivery Network

A rising number of content owners and publishers have found that they require greater control over their delivery than a classic commercial CDN can offer. A private CDN is often the answer: it consists of a set of geographically distributed servers working in clusters, located in the points of presence (PoPs) of your choice, similar to the structure of a traditional CDN. However, the infrastructure is completely dedicated and configured to serve your traffic, whether it’s customer-facing or internal to the company. The SynEdge CDN will pull only from your origin servers, host only your SSL keys and serve only the IP addresses you own.

Our CDN can be deployed both in data centres and in the field, at the outer edges in radio towers. It is optimised for all HTTP adaptive streaming technologies and for delivery to 3G/4G devices. Moreover, SynEdge supports Flash streaming alongside all other codecs, protocols and formats, so that any device on any network can be reached.

Using a private CDN with SynEdge comes with the following benefits:

  • A bespoke service based directly on your needs.
  • Edge resources are not shared with other users, so you’ll have a network of PoPs devoted to your requirements and traffic.
  • Full customisation and control, for example restrictions based on territory and white/blacklisting users.
  • White label solution to be branded as your own.
  • Freedom of choice of where to have the PoPs, and the option to target future growth areas.
  • Potential additional revenue stream.

If you have any requirements for a private CDN or are unsure as to whether a private CDN is the best option for you, a member of our dedicated team will happily assist you in choosing the right CDN product. Email us at: or call us on 01344 706 061.