Developpez.com


What to make of Sun Microsystems' open Cloud Computing platform?

On , by *alexandre*, Inactive
Sun's Cloud Computing platform, currently in its specification phase, seems to me far more open than Google App Engine (GAE).

What's more, one look at the list of contributors is motivating!

A quick link to the specs: http://kenai.com/projects/suncloudapis/pages/Home

Note that the goal is not only to deploy applications on a cloud, but also to develop a standard API allowing communication between clouds.

Here is a guide presenting the architecture: https://www.sun.com/offers/details/CloudComputing.xml

A small example of using Sun Cloud Computing: http://kenai.com/projects/suncloudapis/pages/HelloCloud

Unlike some competitors, the cloud will support files of unlimited size: http://www.sun.com/jsp_utils/vid.jsp...start=true#vid

There is also the Cascading API, which runs on top of Hadoop:

Cascading provides a means for defining arbitrarily large, complex, reusable, and fault-tolerant data processing workflows, and a job planner for rendering those workflows into cluster-executable jobs.

Cascading allows the developer to assemble predefined workflow tasks and tools, collect those workflows into a logical 'unit of work', and efficiently schedule and execute them. These processes can scale laterally on clusters running in the local datacenter or on Amazon EC2.

The kinds of tasks and tools that can be built using Cascading range from a simple log parser to modern Natural Language Processing (NLP), from traditional Extract, Transform, and Load (ETL) to data warehousing, and even from geophysical to geospatial data management.

Cascading currently relies on Hadoop to provide the storage and execution infrastructure. But the Cascading API insulates developers from the particulars of Hadoop, offering opportunities for Cascading to target different compute frameworks in the future without changes to the original processing workflow definitions.

Those familiar with Hadoop know it is an implementation of the MapReduce programming model. And any developer who has built any sort of application using MapReduce to solve real-world problems knows such applications can get complex very quickly. This is further aggravated by the need to 'think' in MapReduce throughout application development.

Thinking in MapReduce is typically unnatural, and tends to push the developer to constantly try to 'optimize' the application. This results in harder-to-read code, and likely more bugs. Further, most real-world problems are a collection of dependent MapReduce jobs. Building them all and orchestrating them by hand does not scale well.
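To make the "thinking in MapReduce" point concrete, here is a toy in-memory word count written in the MapReduce shape, in plain Java. This is not Hadoop's actual API; it only illustrates the explicit map, shuffle-by-key, and reduce phases a developer must express, and real Hadoop jobs add job configuration, serialization, and job-chaining boilerplate on top of each such step.

```java
import java.util.*;
import java.util.stream.*;

// Toy illustration of the MapReduce shape (NOT the Hadoop API):
// an explicit map phase, a shuffle (group by key), and a reduce phase.
public class WordCountSketch {
    // Map phase: one input line -> (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1))
                .collect(Collectors.toList());
    }

    // Shuffle + reduce: group pairs by key, then sum each group.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        return pairs.stream().collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("hello cloud", "hello hadoop");
        List<Map.Entry<String, Integer>> mapped = lines.stream()
                .flatMap(l -> map(l).stream())
                .collect(Collectors.toList());
        System.out.println(reduce(mapped).get("hello")); // 2
    }
}
```

Chaining several such jobs means wiring each reduce output into the next map input by hand, which is exactly the orchestration burden described above.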

Cascading uses a 'pipe and filters' model for defining data processes. It efficiently supports splits, joins, grouping, and sorting. These are the only processing concepts the developer needs to think in.
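The pipe-and-filters model can be sketched in plain Java. Note this is NOT the real Cascading API; the `Filter` interface and `wordCount` pipe below are purely illustrative, showing only the idea that a pipe is a chain of composable filters over a stream of records, with grouping and sorting as just more filters in the chain.

```java
import java.util.*;
import java.util.function.Function;
import java.util.stream.*;

// Conceptual pipe-and-filters sketch (NOT the Cascading API): a filter
// transforms a batch of records, and filters compose into pipes.
public class PipeSketch {
    interface Filter<A, B> extends Function<List<A>, List<B>> {
        default <C> Filter<A, C> then(Filter<B, C> next) {
            return in -> next.apply(this.apply(in));
        }
    }

    // A two-stage pipe: split lines into words, then group, count, and sort.
    static List<Map.Entry<String, Long>> wordCount(List<String> lines) {
        Filter<String, String> split = in -> in.stream()
                .flatMap(l -> Arrays.stream(l.split("\\s+")))
                .collect(Collectors.toList());
        Filter<String, Map.Entry<String, Long>> groupAndCount = in -> in.stream()
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()))
                .entrySet().stream()
                .sorted(Map.Entry.comparingByKey())
                .collect(Collectors.toList());
        return split.then(groupAndCount).apply(lines);
    }

    public static void main(String[] args) {
        System.out.println(wordCount(List.of("hello cloud", "hello sun")));
        // [cloud=1, hello=2, sun=1]
    }
}
```

The point of the model is that the developer declares the chain of transformations and never writes the map, shuffle, or reduce phases directly; a planner (Cascading's, in the real system) turns the chain into jobs.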

At runtime, Cascading generates the minimum necessary number of MapReduce jobs and executes them in the correct order, locally or on a Hadoop cluster. Any intermediate files are automatically cleaned up, and if target files already exist and aren't stale, those jobs can optionally be skipped.
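The "skip jobs whose target isn't stale" rule amounts to a make-style freshness check: run a step only when its output is missing or older than its input. A minimal sketch of that check (the file names are purely illustrative; Cascading's real planner does this internally):

```java
import java.io.File;
import java.io.IOException;

// Make-style staleness check: a job must (re)run when its target is
// missing or older than its source. Purely illustrative file handling.
public class StaleCheck {
    static boolean isStale(File source, File target) {
        return !target.exists()
                || target.lastModified() < source.lastModified();
    }

    public static void main(String[] args) throws IOException {
        File src = File.createTempFile("cascading-input", ".txt");
        src.deleteOnExit();
        File sink = new File(src.getParentFile(), "no-such-output.txt");
        System.out.println(isStale(src, sink)); // true: the target does not exist
    }
}
```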

We firmly believe applications should be built rapidly and designed as 'loosely coupled' as possible. Once an application is working and there are sufficient tests, only then should an application be optimized to remove any clear bottlenecks. Cascading supports this philosophy.

Cascading is also very suitable for 'ad-hoc' applications and scripts that might be needed to extract data from a Hadoop filesystem or to import data from various remote data sources. Or to just simply allow a user to poke around in various files and datasets.

Developers may also reuse existing Hadoop MapReduce jobs with Cascading, allowing them to participate with other Cascading dynamic MapReduce jobs on the cluster.

Read on about some of Cascading's key features.

Also see our documentation section for various examples and in depth tutorials listed on the sidebar.


See also:
What do you think of the open-source Hadoop solution?
[Azure] jdotnetservices - Java SDK for Microsoft .NET Services
A series of articles on GAE (Google App Engine)
Java on Google App Engine



*alexandre* (Inactive), on 30/05/2009 at 17:29
Will the implementation use Hadoop? The upcoming Hadoop Summit '09 conference has a presentation of Sun Cloud on the program.
*alexandre* (Inactive), on 17/06/2009 at 23:15
Several things in Sun's cloud whitepaper catch my attention, notably when they talk about the upcoming development stack.

They don't mention GlassFish as the application server, but lighttpd. Surely they don't want to shoot themselves in the foot???

And I also don't understand why they mix two different file systems that offer the same capabilities (or even fewer, in the case of MogileFS).

Whitepaper here, after registration: https://www.sun.com/offers/details/c...d_guide_button
nicorama (awaiting email confirmation), on 25/06/2009 at 8:14
Plus, it's RESTful.
Note that App Engine's major innovation was billing per minute of CPU actually used, whereas competitors require you to provision the number of CPUs, the RAM, and the bandwidth in advance.
And with a good dose of free usage on top!

Let's hope (though I doubt it) that Sun does the same.