Cloud Computing Explained For Dummies

Cloud computing is everywhere. But the concept remains vague for the general public. Time for a catch-up session.

The phenomenon of “cloud computing” is becoming increasingly widespread in companies, and the general public is discovering it through remote storage services and streaming music. The cloud is on everyone’s lips.

Bernard Ourghanlian is chief technology and security officer of Microsoft France. For L’Express, he explains what the cloud is and what it is for.

How do you define cloud computing?

The general public has long used cloud computing without knowing it. When you use your webmail, Hotmail, Gmail or another, you are using the cloud. Cloud computing means accessing computing resources, located somewhere else, through the internet. Access can be free, as with webmail, or by subscription with a guaranteed level of service: companies buy capacity and are billed a little like water, gas or electricity, paying for what they consume. And like electric power, you can consume as much as you want; the capacity is virtually infinite.

Can we say that cloud computing is a revolution for IT?

It is an economic revolution, but not a technological one, because it is based on long-established technologies.

What is the appeal of cloud computing?

It is mostly economic. A very small company can launch a service without any capital investment in hardware; today, virtually no software start-up invests in heavy equipment. The second advantage is economies of scale. For example, IT resources that French companies leave idle at night are used by companies on the other side of the planet. It is like a factory running 24 hours a day on shared resources. This allows businesses such as e-merchants, who face a peak load at Christmas and a much lighter load the rest of the year, to have the necessary resources during that peak without investing in capacity that would sit unused the rest of the year. That is in fact how Amazon got into cloud services: it had invested in huge capacity and was looking to amortize it.

What does the cloud let us do that we could not do before?

For example, preserving your context when you change devices: you start an Xbox Live game on your Xbox console and continue playing on your mobile phone.

Where are these computing resources, now that they are no longer local? Are some of them in France?

They may be “in the cloud”, but they are firmly on Earth, in giant datacenters a dozen times the size of a football pitch, filled with machines. At Microsoft, two datacenters are currently under construction. We have one in Europe, in Dublin, which cost $500 million, and a backup in Amsterdam, which we rent from hosting providers. The further a datacenter is from the user, the longer the response time: we know how to increase bandwidth, but not how to decrease the latency of requests. For the general public, essentially only email is hosted in France.

The problem is that data ends up held by a small number of companies, and access depends on maintaining an internet connection.

The big players in the cloud are indeed few: Google, Amazon, Microsoft, Salesforce. For more security, businesses can turn to private clouds, which are not shared. The goal is to get the same benefits while keeping control of the environment. This type of service is offered by many players, such as hosting companies and IBM.

As far as the general public is concerned, is it possible to know where the stored data are located?

The average Internet user cannot know. The data can also move. No commitment is made on this point to the general public.

Can the general public also lock their data away in a private area, a digital safe?

This kind of service exists. To make sure that no one can access the data, including the provider, you have to encrypt it, so that only the user holds the keys to his safe. As long as you simply want to store private data, it works. But managing copies of keys at large scale is something we do not yet know how to do. And if you want to share and collaborate, letting a large number of people search through a body of encrypted data (in a business, for example), it is not possible, because the response time becomes prohibitive. That is the grail of today’s cloud research.
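The “digital safe” principle described above, where only the user holds the key, can be illustrated with a toy example. This is a sketch only: the one-time pad below is used because it fits in a few lines, whereas real services use standard ciphers such as AES; the point is simply that the key never leaves the user’s machine.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR one-time pad: sound only if the key is random, as long as
    # the message, and never reused -- a toy stand-in for real ciphers.
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"private document"
key = secrets.token_bytes(len(message))  # the key stays on the user's machine
ciphertext = encrypt(message, key)       # only the ciphertext is uploaded

assert ciphertext != message                 # the provider sees only noise
assert decrypt(ciphertext, key) == message   # only the key holder can read
```

Searching inside such encrypted data without first decrypting it is exactly the open problem the interview mentions.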

The Prism Of The Clouds

When keyword clouds swept over the main blogging platforms a few years ago, the craze they aroused could hardly be put down to rational usefulness alone. There is, in this ancestor of modern data visualizations (which still use it), something that bypasses the brain to speak directly to the visual cortex. The keyword cloud charms and seduces more than it actually serves.

Why is the keyword cloud so pleasing? I will attempt an explanation after detailing the concept and presenting our own version.

The prism of the clouds

In fact, several technologies are grouped under the term “keyword cloud”. The French translation of the expression is also unsatisfactory, because “keyword” reflects only one of the meanings of “tag”, which is at once a label, a marker, a caption… to the point that certain disciplines (such as computer science) more often use the English word, even in French.

The “historical” keyword cloud is therefore a representation of the keywords present on a site, which presupposes that the site has keywords associated with each page or article. In a typical setup, each blog article is associated with a number of keywords added manually by the author. Some recur from one article to the next; others are rarer. The keyword cloud makes these recurrences visible instantly and, in principle, lets you identify the blog’s themes at a glance.

It is important to note that it is the author of the content who supplies the keywords, and therefore brings a certain intelligence, or at least a human eye, to them. There is also an automated version of this cloud that requires no human intervention: the machine scans the contents and counts recurrences.

The man and the machine

The difference is significant, because we move from a human, subjective organization to a purely statistical one, as found in lexicographic analysis. In this version, only the number of occurrences of a word counts, regardless of its semantic importance, hence the often very disappointing character of the software that generates these clouds. We typically observe:

The predominance of “stop words”, those little words like “the”, “to” and “and” that carry no meaning by themselves;

The disappearance of phrases (or expressions) composed of several words, such as “tag cloud”, which is split into “tag” and “cloud”;

A certain amount of noise due to processing the entire site, including menus, buttons and other “Leave a comment” links.

The result is usually useless and overloaded, and conveys little or no information about the “synthesized” contents, unless you already have an intuition or knowledge of those contents and can read between the lines of the cloud.
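The purely statistical cloud described above can be sketched in a few lines. The sample text and the stop-word list are illustrative stand-ins (real tools ship much larger lists); the sketch shows why raw counts disappoint and why stop-word filtering helps:

```python
from collections import Counter
import re

# Illustrative snippet; any blog text would do.
text = """The cloud of keywords makes it possible to see the
recurrences and to identify the themes of the blog at a glance."""

# Hypothetical minimal stop-word list; real tools use much larger ones.
STOP_WORDS = {"the", "of", "to", "and", "it", "at", "a"}

words = re.findall(r"[a-z]+", text.lower())

raw = Counter(words)                                    # stop words dominate
filtered = Counter(w for w in words if w not in STOP_WORDS)

print(raw.most_common(3))       # "the" comes out on top
print(filtered.most_common(3))  # content words only
```

Even filtered, this still splits multi-word expressions and counts navigation text, the two other weaknesses listed above.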

This technology may have inspired as many dreams as the first version of the keyword cloud, which requires significant human work upfront. Was it hoped that the machine would replace the human? I believe expectations have largely been disappointed, but the hope of instantly understanding and synthesizing a body of text is ambitious enough to keep keyword clouds alive despite the disappointments.

From lexico ‘to termino’

We briefly mentioned lexicographic analysis, saying it was a statistical process. Specifically, lexicographic analysis software generally produces a list of all the words used in a text, each associated with a number of occurrences. This “blind” approach can sometimes reveal interesting elements, but it is primarily intended for an informed analyst, who will be able, through methodology and experience, to interpret these statistical results, for example by observing that an insignificant term like “but” is overrepresented compared to the average. Without that kind of background, it is difficult, and above all risky, to make this kind of keyword cloud “talk”.

With a little semantics, you get much more interesting results by doing what is called terminological extraction. The difference, to put it simply, is that we go from the word to the term. A term can be composed of several words and corresponds, in principle, to a concept. For example, “pomme de terre” (potato) is a term whose meaning differs from both “pomme” (apple) and “terre” (earth). The software counts not only occurrences of a term, but also co-occurrences, such as “pomme” and “terre” in our example.

5 Personal Cloud Solutions For Less Than €200

If you are starting to worry about the size of your iCloud or Google Drive bills and you do not know how to set up a NAS server, do not panic: there are solutions for creating a personal cloud, hosted at home or with a professional, but accessible from any machine connected to the internet. Explanations below.

Long reserved for professionals, cloud computing solutions are now accessible to the general public through services offered by the big smartphone names (Apple iCloud, Google Drive, Microsoft OneDrive), by specialized companies (Box.net, Dropbox, Hubic/OVH, Gandi…) or by your operator (Orange Cloud, SFR Cloud, etc.). But these solutions, billed a few euros per month for a few dozen GB, or several tens of euros for the most generous, can weigh heavily on the bill, and it may seem better to create your own cloud solution, for example hosted on a home server.

The cheapest: the Freebox

It is sometimes forgotten, but for a few years Free has been offering 250 GB of storage space in its Freebox Server. Once activated, this space can hold personal files (photos, videos) that are shared on your home network but also accessible remotely, provided you have made the necessary configuration. The solution is free if you are already a Free subscriber, but it is capped at 250 GB and remains difficult to access from a mobile device.

The most classic: My Cloud

A champion of hard disks, Western Digital has also been offering its My Cloud range for the last few years: a connected hard drive offering no less than 2 TB of storage for just 160 euros. Easy to connect to your box via an ethernet cable, this type of drive is mainly used for backing up your computers or smartphones (photos and videos only, via Wi-Fi), but it also offers remote access to the drive when away from home. Beyond storage, WD has worked on sharing, with the ability to create different user accounts that can connect to subfolders of the disk. The solution works well, even if it is far from the synchronization simplicity of public cloud providers.

The most ambitious: Cozy

Developed by two Frenchmen, Cozy is open-source software designed to create your own personal cloud. It is rather ambitious, since it synchronizes contacts, calendars and files between your devices and your server, but it requires installing the Cozy Cloud software on a real server rather than a simple hard disk. Cozy is apparently in talks with OVH to offer turnkey servers (free for the moment) running its solution, but the more hands-on can try installing it on a home server such as the Raspberry Pi (40 euros), to which a storage solution (SD card or external hard drive) must be added. You can test it by downloading the software.

The cheapest: Lima

After three years of R&D and a successful Kickstarter campaign, the small Lima box goes on sale on November 25 for 99 euros. Connected to an internet box, Lima can turn any hard drive into a “connected” drive, giving access to its files from any other computer or from a simple smartphone. All exchanges are encrypted, Lima does not, a priori, keep data on its own servers, and you simply swap the hard drive to increase the capacity of your “cloud”.

The most secure: Helixee

Also popular on Kickstarter, the startup Novathings is beginning to market Helixee, an even more integrated personal cloud solution than Lima, since it includes not only a 1 TB hard drive but also a much more advanced sharing application with a kind of news feed. An interface inspired by Facebook lets you, for example, track files shared by members of the same family. Fully secured, the box is also complemented by smartphone apps (iOS and Android) that simply retrieve updates (photos, contacts, etc.) from the mobile device. The 350 backers who pre-ordered the product will be delivered in the coming weeks. The others will have to order the box from November 23, for 199 euros, and wait for delivery in June 2016.

Sold for less than 200 euros, these largely French solutions (Freebox, Cozy, Lima, Helixee) let you create a personal cloud without a monthly fee. The most technically inclined can also turn to true networked storage (NAS) servers (Synology, QNAP, Seagate…), but prices quickly pass the 200-euro mark for the most advanced solutions.

5 Formidable Threats To Cloud Security

Despite progress in cloud computing, many issues related to cloud security remain. By adopting certain practices, however, the cloud remains the most reliable IT option for businesses. Here are the main threats, along with best practices for cloud security.

Data leaks

Absolute security does not exist. Data leaks can occur between virtual machines, and customers’ personal information or confidential corporate data can be stolen. It is possible for a virtual machine user to eavesdrop on the activity that signals the arrival of an encryption key on another VM of the same host. So far, however, this technique has never been used in a major data leak.

Nevertheless, this prospect discourages many companies from adopting cloud computing. Unfortunately, measures intended to prevent data leakage or theft can exacerbate other threats. For example, encryption protects data from theft, but losing the encryption key means losing the data irreparably. Similarly, regular backups protect against data loss, but expose the copies to theft.

Data loss

A data leak is usually the result of an outside attack. Data loss, on the other hand, can occur when a hard disk fails and the user has made no backup, or when an encryption key is lost. There are methods to avoid these losses, but none is foolproof.

Account theft

Phishing, exploited software vulnerabilities or lost passwords can lead to cloud account theft. An intruder who takes control of an account can manipulate data as he sees fit, interact with customers or redirect them to competing sites.

Hackers may also seize an entire service and compromise its confidentiality, integrity or availability. There are several ways to avoid this kind of trouble. The best is to prohibit the sharing of credentials between users, including trusted business partners, and to implement two-factor authentication.
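The two-factor authentication recommended above commonly relies on time-based one-time passwords (TOTP, standardized in RFC 6238). A minimal sketch, using the shared secret from the RFC’s published test vectors:

```python
import base64, hmac, struct, time

def totp(secret_b32: str, at=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Shared secret from the RFC test vectors; server and user derive the
# same 6-digit code, valid for one 30-second window.
secret = base64.b32encode(b"12345678901234567890").decode()
assert totp(secret, at=59) == "287082"  # published RFC 4226/6238 test value
```

A stolen password alone is then useless: the attacker would also need the device holding the shared secret.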

Unsafe APIs

To prevent anonymous users from attacking cloud services, public APIs define how third parties connect an application to a service and how the identity of that third party is verified. Leading web developers, including engineers from Twitter and Google, collaborated to create OAuth, an open authorization protocol that lets web services control third-party access.

OAuth became an Internet Engineering Task Force standard in 2010, and version 2.0 is used by some Google services, Twitter, Facebook and Microsoft. However, no public API is perfectly secure; depending on OAuth may expose a company to security issues affecting the confidentiality, integrity or availability of its services.
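To illustrate how a third party authenticates to a service under OAuth 2.0, here is a sketch of a client-credentials token request. The endpoint and credentials are hypothetical, and the request is only constructed, not sent:

```python
import base64
import urllib.parse
import urllib.request

# Hypothetical endpoint and credentials, for illustration only.
TOKEN_URL = "https://auth.example.com/oauth2/token"
CLIENT_ID, CLIENT_SECRET = "my-app", "s3cret"

body = urllib.parse.urlencode(
    {"grant_type": "client_credentials", "scope": "read"}).encode()
basic = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()

request = urllib.request.Request(
    TOKEN_URL,
    data=body,  # form-encoded POST body, as the spec requires
    headers={"Authorization": f"Basic {basic}",
             "Content-Type": "application/x-www-form-urlencoded"})

# urllib.request.urlopen(request) would return a JSON body containing an
# access_token, then presented as "Authorization: Bearer <token>".
```

The security caveat in the text applies here: the client secret and the issued tokens are exactly the credentials an attacker will target.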

Denial of service

DDoS attacks involve sending millions of automated queries to a service in order to saturate it. Worse, with a cloud service, the company can receive an astronomical bill for the resources consumed during the attack. These attacks are becoming ever more sophisticated and harder to detect before it is too late.
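A first line of defense against request floods is rate limiting. Here is a minimal token-bucket sketch; it is not a substitute for a real anti-DDoS service, which operates at network scale, but it shows the principle of absorbing bursts while refusing sustained excess:

```python
import time

class TokenBucket:
    """Allow at most `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=5)
results = [bucket.allow() for _ in range(8)]  # burst of 8 near-instant requests
assert results == [True] * 5 + [False] * 3    # burst absorbed, excess refused
```

Capping accepted requests also caps the resource consumption that generates the “astronomical bill” mentioned above.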

Ill-intentioned employees

A company is never safe from a malicious employee, ready to steal data from within without scruple. To avoid this disaster, the best course of action is to keep encryption keys on a physical storage medium rather than in the cloud. Firms that hand their security entirely over to a cloud provider are exposed to increased risk.

The abuse of cloud services

Cracking an encryption key with limited hardware can take years. However, hackers also have access to cloud services, and can rent cloud servers to crack such keys far faster. They can also use these servers to launch malware and DDoS attacks, or to distribute pirated software.

Cloud service providers are responsible for preventing such abuse, but inappropriate uses are difficult to detect. In any case, companies should check how a cloud provider responds to such abuse before choosing it as a partner.

The lack of precautions

Many companies move into the cloud without really understanding what that decision entails. If they do not fully understand a vendor’s offer, they do not know what to expect in the event of an incident, or how encryption and monitoring are handled. The firm is thus exposed to increased risk.

Shared technologies

Attacks on shared cloud infrastructure compromise more than the targeted client: all the clients sharing it are exposed to data leaks. It is recommended to adopt a defense-in-depth strategy and monitoring measures.

5 tips to ensure cloud security

Adopt a secure-by-design approach

Companies must learn to identify the controls that provide direct access to information. Adopting a so-called secure-by-design approach from the outset makes it possible to meet security needs, and also facilitates the implementation of cloud resiliency and auditing solutions.

Determine alternative deployment locations so images can be redeployed quickly

It is important to determine alternative deployment environments and to select a cloud vendor that does not impose onerous conditions on changing them when needed. By remaining flexible, the company ensures it can react to changing conditions without interrupting its activity.

Cloud Screener Builds On Its Consulting Business To Grow

The French start-up Cloud Screener, specialized in comparing and measuring the quality of cloud offers, has developed a consulting activity. By the end of the year, consulting should represent 40% of the company’s business, and recruitment will strengthen it further in 2019.

The French company Cloud Screener, led by Anthony Sollinger, is getting stronger in cloud computing consulting. Credit: D.R.

Having offered a cloud comparison service for the past five years, and since 2016 a service measuring the quality of such offers, the French start-up Cloud Screener has been expanding into consulting to grow further. “When we created the company, we were not in the consulting business; we only offered tools,” said Anthony Sollinger, co-founder and CEO of Cloud Screener. “We managed to launch a consulting offer by recruiting five dedicated consultants, deployed to customers.”

Far from alone in the cloud consulting market (Capgemini, Atos, Devoteam…), Cloud Screener is counting on a few strengths to stand out. “We have implemented a proven methodology, designed with Inria, across different clouds,” continues Anthony Sollinger. “We are able to run performance tests on machines, look at virtual server behavior, processor speed and RAM, run multiple tests and track them over time.” Among the first customers of Cloud Screener’s consulting offer is LVMH, which wanted “to see more clearly into its private cloud, analyze the results and the compliance of what its suppliers deliver, gain a better understanding of internal operations and carry out the necessary corrective actions,” says Anthony Sollinger. Among the historical customers of the comparison offering is DINSIC, which runs Cloud Screener benchmarks on its Cloudwatt-based public cloud infrastructure to verify that the provider’s quality-of-service commitments are being met.

Already 12 customers for cloud consulting missions

“We come with expertise based on consultant profiles with a technical background and the ability to work on-site with the customer. They are mostly junior profiles with three years of experience,” says Anthony Sollinger. Cloud Screener’s goal is to make consulting one of the pillars of its business. Whereas last year it represented a quarter of its activity, it is expected to account for 40% of revenue by the end of the year. Recruitment has followed and will continue in 2019, with teams projected to grow by 50%, bringing the total workforce to a little over fifteen. “The idea is to generate recurring sales with consulting,” predicts Anthony Sollinger. This seems well under way: of the 60 current customers (Docapost, GRDF, Unibail…), 12 have started or already completed cloud consulting missions with the French start-up.

More And More Offers Available

Cloud computing covers several families of uses: online software (Google Gmail or Google Apps, Microsoft Office 365, Oracle Cloud Office, Orange Cloud Computing, SFR Business Team, etc.), data archiving, computing power or development environments, and collaboration through shared workspaces and synchronous communication tools (available from Amazon, Google, or Salesforce.com). All fields of activity are now concerned, even if CRM, collaborative applications and data storage attract the most interest from French companies. “Two or three years ago, data storage represented the bulk of cloud offerings. Today, almost all service offerings are available: CRM software, accounting/ERP solutions, reporting and office solutions,” observes Christophe Suffys, CEO of Bittle, a publisher of cloud reporting solutions.

Moving towards a cloud offer means answering this question: is it better to buy or rent your company’s software and servers? In the second case, you pay only for what you use, the very principle of the cloud. This flexibility allows a fast response if the company experiences strong growth. “The flexibility of the cloud model allows managers to anticipate peaks or dips in activity,” says Hélène Caraux (OVH). “Thanks to usage-based fees, it is easy to size the scope of your needs,” adds Noël Minard, CEO and partner of A2com Resadia (an IT integrator and trainer). Flexibility comes with speed of implementation. When acquiring traditional applications, it is common to turn to your IT department, submit a procurement request, buy the necessary servers, set them up, and then contact the software publisher to install the solutions.

In cloud mode, server provisioning and software installation can be done in a matter of hours. In addition, updates are no longer the user’s responsibility, and scaling under load is automated. Likewise, the cloud makes it possible to dispense with deploying servers (of different brands) dedicated to each application (CRM, ERP, etc.), avoiding a sprawl of resources that is difficult to manage. Cost is also a decisive point when moving to cloud computing. Beyond usage-based fees, experts agree that the financial gain is between 20 and 40%, and the savings can be even greater. Take the case of an email system for a dozen people. Between the purchase of the software and the costs of support and maintenance, you should count on about 5,000 euros (excl. tax) per year in license mode. In the cloud, the equivalent is estimated at 750 euros (excl. tax) per year.
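The messaging example above works out as follows, using the figures quoted in the article:

```python
# Figures quoted in the article for email serving about a dozen people,
# in euros excluding tax, per year.
license_cost = 5000   # software purchase + support + maintenance
cloud_cost = 750      # equivalent service in the cloud

savings = license_cost - cloud_cost
savings_pct = 100 * savings / license_cost
print(f"Cloud saves {savings} EUR/year ({savings_pct:.0f}%)")
```

That is well beyond the 20-40% range experts cite as typical, consistent with the article’s remark that savings can be even greater.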

7 Monitoring Solutions For The Public Cloud

AppDynamics, Datadog, Dynatrace, New Relic… A growing number of vendors offer tools to monitor the performance of applications in the public cloud.

While their information systems have moved heavily to the cloud in recent years, businesses need to be able to evaluate the performance of their cloud providers as they always have with their own infrastructure. The question is even more crucial for applications hosted in the public cloud: degradation in the quality of an IaaS or PaaS service, or outright unavailability, can have very damaging consequences for the business.

Fortunately, there are a multitude of open source or proprietary monitoring tools to reduce the risk of failure. Here is a selection.

AppDynamics: metrology at the service of business

Acquired last January by Cisco for $3.7 billion, AppDynamics is another heavyweight of the APM market. Its Cloud Application Performance Monitoring offering provides real-time oversight of applications deployed on the main IaaS and PaaS platforms, including Amazon Web Services, IBM, Microsoft Azure and Pivotal Cloud. It is designed to describe their performance as visually as possible across the whole chain, from code execution to infrastructure. Via its new App iQ tool, the solution also measures the impact of this performance on the company’s activity (particularly on transaction volume). This business positioning aims to go beyond the APM segment, which primarily targets the CIO, to also address business departments. Pricing on request.

CloudWatch, the watchdog for AWS services

With CloudWatch, Amazon Web Services has a monitoring service dedicated to applications running on its platform and the related resources (EC2 instances, DynamoDB tables…). It can collect and track metrics such as latency, trigger alarms, manage logs and build dashboards. CloudWatch’s strength is its integration into the Amazon ecosystem: when a given EC2 instance exceeds a predefined threshold, for example, it can dynamically remove that instance or, on the contrary, add a new one by invoking AWS auto-scaling mechanisms. This strength is also its weakness, as CloudWatch remains confined to a single cloud. As for its fee schedule, it is a model of complexity: the customer is charged per month based on the number of dashboards used ($3 per dashboard), alarms ($0.10 each), queries ($0.01 per 1,000 requests) and GB of logs handled ($0.50 per GB).
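Given the per-item prices quoted above, a monthly CloudWatch bill can be estimated with a simple function. The unit prices are those cited in this article; AWS’s actual price list may differ by region and over time:

```python
def cloudwatch_monthly_cost(dashboards: int, alarms: int,
                            requests: int, log_gb: float) -> float:
    """Monthly bill estimated from the unit prices cited in the article."""
    return (dashboards * 3.00           # $3 per dashboard
            + alarms * 0.10             # $0.10 per alarm
            + (requests / 1000) * 0.01  # $0.01 per 1,000 requests
            + log_gb * 0.50)            # $0.50 per GB of logs

cost = cloudwatch_monthly_cost(dashboards=2, alarms=20,
                               requests=500_000, log_gb=10)
print(f"${cost:.2f} per month")
```

Even this modest usage profile illustrates how the four billing dimensions combine, which is what makes the fee schedule hard to predict.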

Datadog: natively cloud monitoring

Founded in 2010 by two French engineers, Datadog is based in the United States, where it generates most of its revenue; eBay, Intel and Stripe are among its references. Its solution is natively tailored to monitoring applications in cloud environments. It relies on more than 200 connectors to integrate the various services of public cloud providers (AWS, Azure, Google Cloud Platform…), the most popular SaaS applications (Zendesk, Slack, Jira…) and big data databases and infrastructure (Hadoop, Kafka, Spark…). The product also supports virtual machines and containers. The collected metrics feed interactive, real-time dashboards. Datadog recently acquired Logmatic.io, a Parisian start-up specializing in application log management; event analysis will thus complement the APM and monitoring bricks of its offer. Datadog offers a “pro” version at $15 per host per month and a $23 “enterprise” version with extended features and premium support.

Dynatrace: full stack monitoring

Acquired in 2011 by Compuware and then by the investment fund Thoma Bravo in 2014, Dynatrace is a historical player in APM. Repositioned around “digital performance management”, its solution aims to offer full-stack IT monitoring: it covers the entire application chain, monitoring the application itself as well as metrics from real users, internal infrastructure and cloud environments. It also supports Docker containers. Dynatrace intends to use artificial intelligence to automate this monitoring and prevent performance problems. Its connection with AWS is particularly strong: since mid-November, Dynatrace has been present in the US cloud provider’s marketplace, and it can also monitor the new VMware Cloud on AWS hybrid offering, available for the time being only in the United States. In “pay as you go” mode, rates start at $0.035 per host per hour.

New Relic: AI in the service of predictive analytics

New Relic is the third APM heavyweight in this selection. Unlike Dynatrace or AppDynamics, this vendor took the cloud route from its creation in 2008, and this SaaS orientation, which makes its solution accessible to the greatest number, is seen as one of the keys to its current success. Alongside its generic suite (New Relic APM), the company offers modules specifically dedicated to infrastructure monitoring, mobile application monitoring and web browser performance. Like Dynatrace, New Relic extends full-stack coverage to hybrid cloud environments and supports Docker containers, microservices and serverless environments. And, again like Dynatrace, New Relic uses artificial intelligence and machine learning for predictive analysis and to identify previously undetected anomalies. The price of New Relic APM depends on the type of instance chosen.

Unigma, a multicloud approach

Acquired last May by Kaseya, a publisher of IT systems management solutions for managed service providers (MSPs), Unigma offers a multicloud monitoring solution. The application can monitor the public clouds of AWS, Microsoft Azure and Google Cloud Platform. It is composed of three products. Cloud Manager is the monitoring tool itself: it traces the metrics of the different public clouds and compares their respective performance. Based on this benchmark, the user can, with a second module called Cloud Cost Optimizer, adjust the allocation of resources between clouds. Finally, Cloud Billing Manager produces detailed billing per provider. The prices of these three modules are not public. Note that in May, Kaseya entered into a partnership with Plenium Service Informatique, making this service company its exclusive reseller in France.

Vistara, for hybrid environments

Founded in California in 2014, Vistara takes a general approach to IT resource management, covering both hardware and software layers. This allows it to position itself in hybrid environments, where applications within the same organization may be hosted locally or in the cloud. Vistara’s promise is to track, in a single interface, the performance of on-premise (installed on the company’s internal infrastructure), virtualized and cloud applications. Using APIs, the solution can collect the technical indicators of AWS, Microsoft Azure or Google Cloud Platform. It also integrates with other monitoring tools such as Nagios, New Relic or SolarWinds, and with incident managers such as BMC Remedy, Atlassian (Jira) or ServiceNow. Vistara has recently been referenced by Gartner and 451 Research. Again, rates are not disclosed.

Data Security In Cloud Computing

Businesses and governments are migrating more and more workloads to the cloud. However, some companies remain resistant to the considerable strengths of the cloud amid persistent fears about data security. Although these concerns are understandable and legitimate, data security in cloud computing can match or surpass that of traditional on-premise computing platforms.

Why can data security in cloud computing equal or surpass that of traditional IT platforms?

For starters, it is important to answer the question: what is the cloud? According to the most common definition of cloud computing, cloud providers make computing resources and applications available to users as a metered service accessible over the Internet. Cloud services are typically categorized as Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), the latter covering raw processing power or cloud storage.

By its very nature, cloud computing requires users to cede some control to their service provider. But control and security are not synonymous. In fact, data security in cloud computing can be superior to that of traditional enterprise data centers, for the same reasons that have a positive impact on the market as a whole: economies of scale and the division of labor.

How Akamai helps preserve data security in Cloud Computing

Akamai operates the world’s largest web content delivery network (CDN). This network consists of more than 160,000 servers in over 95 countries and carries up to 30% of global Internet traffic. Akamai helps customers (service providers and businesses) deliver web content and applications to their end users faster and more reliably. Akamai also offers a suite of security solutions integrated into its market-leading CDN. Akamai’s cloud-based security solutions protect against major cyber threats such as DDoS attacks, SQL injection and XSS attacks, as well as some less familiar Internet-based attacks.
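Of the attack types mentioned, SQL injection is the easiest to illustrate. The sketch below, using Python's standard-library sqlite3 module and an invented schema, shows why parameterized queries, rather than string concatenation, are the basic application-level defense; a CDN-level security layer like the one described adds filtering in front of the application, it does not replace this:

```python
import sqlite3

# Minimal illustrative database (schema and data are hypothetical).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "nobody' OR '1'='1"

# Vulnerable: string concatenation lets the payload rewrite the query.
unsafe = db.execute(
    "SELECT secret FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: a parameterized query treats the payload as a literal value.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()

print(unsafe)  # [('s3cret',)] -- the injection matched every row
print(safe)    # []            -- no user is literally named that
```

The same principle, treating user input as data rather than as code, is also the core of XSS defenses (output escaping instead of raw HTML interpolation).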

Akamai’s security solutions are highly scalable and offer a globally distributed advanced defense line that detects and defeats attacks before they reach your data center. We offer solutions specifically designed for cloud service providers as well as solutions for organizations running applications in public clouds or using their own private cloud.

With the growing adoption of cloud computing technologies, security issues have become a real concern for businesses. Taking the unique nature of these issues into account from the very beginning of an implementation plan is what safeguards a lasting investment. This course provides deep, practical experience in identifying and resolving public and private cloud security issues.

Does Oracle Want To Hide Its Weakness In The Cloud By Merging Its Results Iaas, Paas And Saas?

Oracle has reported the results of its quarterly cloud activity by adding together revenue from SaaS, PaaS, and IaaS, rather than segmenting them as usual. Some see in this a willingness of the publisher to hide its weakness in hosted services.

Oracle has decided to change the way it communicates to the public, and especially to its shareholders, about its financial performance in the cloud. Until now, the publisher separated the turnover of this activity into two major components: revenue from SaaS on the one hand, and revenue from PaaS and IaaS on the other. Things changed with the publication of the accounts for its fourth quarter of fiscal 2018: all these revenues are now lumped together, and it does not stop there. During a conference call last week, Oracle co-CEO Safra Catz explained what the single cloud billing line consists of: “We combined SaaS, PaaS and IaaS billings with those generated by license updates, support for cloud services and license support.” In other words, Oracle not only no longer segments its cloud revenue, it no longer even reports a specific line for hosted services in its accounts.

The Real Share Of The Cloud Dropped In Q4

In the fourth quarter, aggregate billings from all of these businesses increased 8%, to 60% of Oracle’s $11.3 billion (+3%) in total revenue. This did not really impress investors: Oracle’s share price fell 7% after the publication of these figures, even though they were higher than analysts’ expectations. Perhaps because Oracle’s true cloud aggregate (SaaS, PaaS, and IaaS only) accounts for $1.7 billion in revenue, or 15 percent of overall business, compared with 18 percent in the previous quarter. As everyone knows, Oracle has historically marketed database software and applications for on-premise deployment. For some time now, it has seen the cloud as a huge growth opportunity, and the figures for its cloud activity were often prominently highlighted in its financial communications.
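As a quick sanity check on the figures above (amounts in billions of dollars, rounded as in the article):

```python
total_revenue = 11.3  # Oracle Q4 total revenue, $bn
true_cloud = 1.7      # SaaS + PaaS + IaaS only, $bn

cloud_share = true_cloud / total_revenue
print(f"{cloud_share:.0%}")  # 15% -- down from 18% the previous quarter
```

So the "true" cloud share of roughly 15% is consistent with the two dollar figures quoted, and the quarter-over-quarter decline is what the aggregated reporting line obscures.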

A Change Explained By The Launch Of The BYOL Option

Safra Catz justifies the change in reporting by the recent introduction of the Bring Your Own License (BYOL) option in the publisher’s pricing model. “BYOL allows users to transfer their existing on-premise licenses to Oracle’s cloud as long as they continue to pay for the support related to those licenses,” she explains. “This model makes the purchase of new licenses more advantageous, even if they are only intended for use in the cloud. As a result, our license revenue is now a combination of revenue from new cloud licenses and new on-premise licenses. Our support revenue is likewise a combination of cloud license support billing and on-premise license support billing.”

Is that clear? Not really. The executive explains that because users take a hybrid approach, these revenues cannot be segmented more finely, and so they are all classified under the cloud heading. From this shaky argument, it is only a short step to suspecting that Oracle wants to hide its weakness in the cloud. Safra Catz denies it: “Previously, all these licenses and the associated support would have been fully accounted for in our on-premise income, which they clearly are not.”

Cloud Computing: The Advantages Of Computing In The Clouds

Even if the concept of cloud computing is not a technological innovation, the advantages that SMEs can now draw from it are decisive.

If cloud computing – literally, “computing in the clouds” – has been making inroads in the professional world for a year or two, the general public has been using it for a long time… without knowing it. Using a webmail service (Hotmail, Gmail, etc.) is a cloud activity. The notion of cloud computing refers to accessing computer resources stored on the Internet. That access can be free, as is the case with webmail, or by subscription, with a guaranteed level of service.

Cloud computing consists of moving IT processes traditionally located on the user’s computer to remote servers. On the professional side, the outsourcing of data comes with an additional distinction between public, private and even “hybrid” clouds. “When we talk about the public cloud, resources are available outside the company via the Internet,” says Régis Louis, senior director of product management for middleware, EMEA, at Oracle. In the case of a private cloud, the infrastructure serves a single organization. It can be managed by the organization itself (internal private cloud) or by a third party (external private cloud). In the latter case, adds Régis Louis, “the infrastructure is entirely dedicated to the company and accessible via secure VPN (Virtual Private Network) connections. It is even possible to resort to a ‘hybrid’ system, that is to say, a mix of public and private.”

Globally, research firm IDC estimates that cloud services accounted for 5% of global ICT (information and communication technology) investment in 2009, or $17 billion. Stimulated by average annual growth of 25%, cloud computing was expected to capture 10% of global investment in 2013, or $44 billion. At the European scale, and according to a study carried out for the European Commission by the firm PAC, the cloud computing market in the EU-27 was worth €4 billion in 2009. That amount represented about 1.5% of the total software and services market, but could grow strongly and reach 13% of that market by 2015. In France, the firm Markess International put the market for hosting and cloud services at €2.3 billion in 2011, and expected it to reach €3.3 billion in 2013. “Given the growing need for mobility services and the growing number of business terminals (laptops, tablets, smartphones, etc.), the future of the cloud is assured,” says Hélène Caraux, private cloud computing project manager at OVH, the web hosting specialist. “The cloud frees the company from hardware constraints and helps it refocus on its core business.”
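As a rough check of IDC's projection (figures in billions of dollars, and assuming the quoted 25% average annual growth is compounded over the four years 2009 to 2013):

```python
base_2009 = 17.0  # cloud services market in 2009, $bn (IDC estimate)
growth = 1.25     # 25% average annual growth
years = 4         # 2009 -> 2013

projected_2013 = base_2009 * growth ** years
print(round(projected_2013, 1))  # 41.5 -- in the ballpark of IDC's $44bn
```

Compounding 25% for four years yields roughly $41.5 billion, close to the $44 billion figure quoted; the small gap suggests IDC's underlying growth assumption was slightly above 25% per year or unevenly distributed.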