What Is Peer To Peer (P2P)?
Peer-to-peer (often abbreviated “P2P”) is a network model related to the client-server model, but in which each client is also a server. The most widespread application of peer-to-peer is file sharing. The advent of fast, always-on Internet connections (ADSL in particular) contributed to this boom.
Each user is a peer in the network, and the shared resources are files. Everyone can share files and download files from others. These systems are effective even when it comes to exchanging large volumes of data.
Among the most commonly used applications are BitTorrent and eMule.
Using peer-to-peer requires each node to run particular software. This software, which fills the functions of both client and server, is sometimes called a “servent” (a contraction of “server” and “client,” coined in the Gnutella community), or more commonly, though in a narrower sense, a “client.” This is also the origin of the word “peer” in peer-to-peer: communication and exchanges occur between nodes that have the same responsibility in the system.
Peer-to-peer goes well beyond file-sharing applications. It makes it possible to decentralize services and make resources available in a network. Any node of a peer-to-peer network may then offer objects and retrieve them from the network. Peer-to-peer systems can thus facilitate the sharing of information. They also make censorship, as well as legal or pirate attacks, more difficult. These advantages make peer-to-peer a tool of choice for decentralizing services that must ensure high availability while keeping maintenance costs low. However, these systems are more complex to design than client-server systems. Proposals to use peer-to-peer would, over the more or less long term, make it possible to do without servers for, among others:
- DNS;
- Software distribution (Linux distributions such as Mandriva, Microsoft updates, World of Warcraft, etc.);
- Distribution of multimedia content (streaming);
- Online e-mail software.
The best-known application currently remains file sharing, through software acting as both client and server, such as eDonkey/eMule (the original eDonkey protocol), FastTrack (used by KaZaA), etc.
However, decentralized peer-to-peer systems have more difficulty than client-server systems in disseminating information and coordinating the interconnection of nodes, and thus in ensuring low-latency requests. Therefore, peer-to-peer imposes a structure on the connected nodes to keep communication delays low: these are structured decentralized systems. These systems rely on graph structures to interconnect the nodes. They can thus dispense with servers while providing load balancing among the nodes, in terms of:
- The control traffic received and sent by each node, by limiting the number of nodes to which each node is connected;
- The number of requests sent to each node;
- The responsibility for access to objects shared in the network.
Finally, these systems can often use a routing scheme matched to the graph they are based on, thereby reducing the number of request messages passing through the network.
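One common way structured systems assign responsibility for shared objects, as described above, is consistent hashing: nodes and object keys are hashed onto the same identifier ring, and each object belongs to the first node clockwise from its key. The sketch below is illustrative (the ring size, hash truncation, and node names are assumptions), not a specific protocol implementation.

```python
# Sketch of responsibility assignment in a structured overlay:
# nodes and keys share one hashed identifier ring, and a key belongs
# to the first node at or after the key's position (wrapping around).
# Ring size and node names are illustrative assumptions.
import hashlib

RING_SIZE = 2 ** 16  # deliberately small identifier space for illustration

def ring_id(name: str) -> int:
    # Hash a node name or object key onto the ring.
    digest = hashlib.sha1(name.encode()).hexdigest()
    return int(digest, 16) % RING_SIZE

def responsible_node(key: str, node_names: list) -> str:
    key_id = ring_id(key)
    # Order the nodes by their position on the ring.
    ring = sorted(node_names, key=ring_id)
    # The key belongs to the first node clockwise from its position.
    for node in ring:
        if ring_id(node) >= key_id:
            return node
    return ring[0]  # wrap around the ring
```

A useful property of this scheme is that adding or removing one node only moves the keys of the adjacent ring segment, so responsibility stays balanced without any central server tracking who holds what.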
Peer-to-peer should not be confused with the concept of a point-to-point link, or with the Point-to-Point Protocol (PPP).
Peer To Peer Applications
Peer-to-peer made itself known not as a principle but through the applications that emerged from this new network model.
Peer-to-peer may be centralized (connections passing through an intermediate server) or decentralized (nodes making direct connections). It can be used for peer-to-peer file sharing, scientific computing, or communication.
Peer To Peer — General Principle
Peer-to-peer systems allow multiple computers to communicate over a network and easily share objects over the Internet – most often files, but also continuous media streams (streaming), distributed computing, a service (such as telephony with Skype), etc.
Peer-to-peer decentralizes systems that were previously based on a few servers, by allowing all computers to play the roles of both client and server (see client-server). In particular, file-sharing systems make objects all the more available as they are popular, and thus replicated on a large number of nodes. This reduces the load (in requests) imposed on the nodes sharing popular files, which helps increase the number of nodes, and hence of files, in the network. This is called scaling.
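The scaling argument above can be made concrete with a back-of-the-envelope calculation: if a file attracts R requests and is replicated on N nodes, each replica serves roughly R / N requests. The numbers below are purely illustrative assumptions.

```python
# Back-of-the-envelope sketch of why replication helps scaling:
# per-replica load is roughly total requests divided by replica count.
# All numbers are illustrative assumptions.
def load_per_replica(total_requests: int, replica_count: int) -> float:
    return total_requests / replica_count

# A popular file attracts far more requests, but popularity also means
# far more peers hold a copy, so the load on each node stays flat.
unpopular_load = load_per_replica(total_requests=100, replica_count=5)
popular_load = load_per_replica(total_requests=100_000, replica_count=5_000)
```

Here both the niche file and the file that is a thousand times more popular impose the same per-node load of 20 requests, which is the self-balancing property the paragraph describes: demand and supply grow together.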
A second application, aimed at the general public or at research but less widespread than file sharing, is the ability for users to make available some of their computing power.
Today’s computers are so powerful that, most of the time, a large part of their processing power sits idle and available for computation. The BOINC project took advantage of this to create a huge distributed-computing pool, using this enormous computing power to carry out calculations too complex to be performed in a single laboratory.
The BOINC project asks individuals to allow the use of the computing power they do not immediately need, in order to contribute to research on protein folding (Folding@home) and even the search for extraterrestrial intelligence through analysis of the electromagnetic spectrum (SETI@home).
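The volunteer-computing pattern described above can be sketched as follows: a coordinator splits one large job into independent work units, each unit is computed by a different machine's spare capacity, and the partial results are aggregated. This is a toy illustration only; the task (summing squares), the chunking, and the use of local threads as a stand-in for volunteer machines are all assumptions, not how BOINC itself works internally.

```python
# Toy sketch of the volunteer-computing pattern: split a big job into
# independent work units, compute them in parallel, aggregate results.
# The task and chunk size are illustrative; local threads stand in
# for the volunteers' machines.
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk: range) -> int:
    # What one volunteer's machine would compute locally.
    return sum(n * n for n in chunk)

def run_project(n: int, unit_size: int) -> int:
    # Coordinator role: cut [0, n) into independent work units.
    chunks = [range(i, min(i + unit_size, n)) for i in range(0, n, unit_size)]
    # Dispatch units to workers and aggregate the partial results.
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(work_unit, chunks))
```

The key requirement this sketch highlights is independence: work units must not depend on each other, so that volunteers can join, compute, and leave at any time without coordination among themselves.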