Within the scope of this research, we will elaborate on network performance management in an organization from an IT perspective. For a company to be successful, a strategic computer network plan should start with defining the services the computer network will offer to the line operations of its business. Here, services means the network application protocols the network will support, such as the Internet Protocol (IP) or Systems Network Architecture (SNA). (Jensen 1998) Most large companies utilize a large number of application protocols, but many are reducing that number to simplify support and reduce capital equipment investment. If a company has only one protocol (usually IP), it is ahead of the game. If not, it will have to develop an interim service basis protocol and a strategy for converging company applications on a common approach. Essentially, a service basis protocol is a network protocol that forms a common lower protocol layer beneath all of the application protocols the company currently uses.
Basically, a company’s goal is to harmonize current usage on this service protocol, then work to evolve applications toward a more suitable application protocol, such as IP. Adopting a common low-level protocol is easier in the near term than converting all applications to IP. (Barr 2006) Once a service basis has been established, there is a need to publish that protocol to the computing and information applications that use the network and support the business. To publish means defining the rules by which applications, users, and equipment can attach to the network using that protocol. These rules usually consist of specifications for client and server hardware adapters, software, and so on, and will also define addressing conventions, device setup, and other variable parameters. Finally, there is a need to connect the above process to the ongoing business planning process. Many companies fail to keep their networks and computer organizations up-to-date with business trends, including smart business systems, even when those trends may impact their networks at a later point. (Weizner 2005)
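The publishing step described above can be sketched concretely. The rule set below is a hypothetical illustration in Python, not an actual corporate specification: every field name, adapter model, and address range is an assumption, chosen only to show the kind of parameters such a published attachment rule set would pin down.

```python
# Hypothetical "published" attachment rules for a service basis protocol.
# All names, adapter models, and address ranges are illustrative assumptions.
attachment_rules = {
    "service_basis_protocol": "IP",
    "client_adapters": ["100BASE-TX Ethernet NIC"],    # approved client hardware
    "server_adapters": ["1000BASE-T Ethernet NIC"],    # approved server hardware
    "addressing": {
        "scheme": "IPv4",
        "corporate_prefix": "10.0.0.0/8",              # example private range
        "assignment": "DHCP",
    },
    "device_setup": {
        "dns_suffix": "corp.example.com",              # placeholder domain
        "mtu": 1500,
    },
}

def can_attach(device):
    """Check an attachment request against the published rules (illustrative)."""
    return (device.get("protocol") == attachment_rules["service_basis_protocol"]
            and device.get("adapter") in attachment_rules["client_adapters"])

print(can_attach({"protocol": "IP", "adapter": "100BASE-TX Ethernet NIC"}))   # True
print(can_attach({"protocol": "SNA", "adapter": "Token Ring adapter"}))       # False
```

The point of such a machine-readable rule set is that the same document that tells users how to attach can also drive automated admission checks, which is one practical meaning of "publishing" the protocol.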
An integral part of computer networking is content management. It is the end-to-end process of acquisition, collaboration within the enterprise to be able to manage the content, and then the dissemination of that content to the Web site, to partners, to channels, to syndication, and to other organizations. While content management’s roots today may lie in document management to a large degree, its future will likely lie on the Web and beyond as its evolution pushes the concept of what content is and how it can be used for E-business. The Web gave content management a boost as companies began to realize that although running a business on the Web has many benefits, it also requires making content useful and relevant on-line. Companies are finding a need to collaborate around content, and that often means bringing together users and content from different parts of the globe.
As content management takes more of a central role in business, it must be extended to include more content sources, access methods, and kinds of content. (Weizner 2005) Content itself is still, of course, very important, but it cannot be seen in isolation, reduced to just text and graphics. Users must look at things like more active content and be able to manipulate that kind of content on-line. Until now, content management has been largely limited to Web content management, but in the future it will cover the entire enterprise. From this view, it can be called enterprise content management, where the user can bring Web content and documents from inside and outside the company into one content management infrastructure that is global in nature and distributed in structure.
From this view, well-planned content management strategically distributes information within the company and outside it via the Web. Decision makers must know what is needed for E-business across the enterprise, and there must be a strategy to deliver it, one that goes beyond just data marts and data warehouses. The bottom line is that there must be an enterprise content management strategy that reaches beyond the company’s boundaries for a truly optimized enterprise.
One way of viewing a company’s data infrastructure is from an enterprise storage network (ESN) perspective in order to better optimize a company’s operations. The concept is simple enough—improve access to the large volumes of data residing in disparate storage systems. In effect, ESN is an architecture used to extend the reach and flexibility of a data infrastructure in order to leverage its value to more of the organization. As such, an ESN architecture consists of both enterprise storage and storage networking technologies to deliver a common way to manage, protect, and share information and knowledge regardless of distance or scale. (Berninger 1999)
When designing and implementing an ESN, there is a need to take into account expected growth, the number of connections, information security, scalability, availability, performance, and the services required. It is important to note that ESN represents an inclusive strategy. It includes storage area networks (SANs), network attached storage (NAS), and direct attached connections because the needs of most practical situations cannot be met by a single connection topology. (Barr 2006) ESN encompasses both SANs and NAS in order to address the demands of realistic storage environments.
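As a rough illustration, the design factors listed above can be captured in a single planning record. The structure below is a sketch only; the field names and example targets are assumptions, not figures from any real ESN deployment.

```python
from dataclasses import dataclass, field

# Illustrative only: one record holding the ESN design factors named in the
# text. Field names and the sample values below are assumptions.
@dataclass
class ESNDesignPlan:
    expected_growth_pct_per_year: float
    connection_count: int
    security_zones: list = field(default_factory=list)
    scalability_target_tb: float = 0.0
    availability_target: float = 0.999      # fraction of uptime required
    throughput_target_mbps: float = 0.0
    required_services: list = field(default_factory=list)

    def meets_availability(self, measured: float) -> bool:
        """Compare a measured uptime fraction against the design target."""
        return measured >= self.availability_target

plan = ESNDesignPlan(
    expected_growth_pct_per_year=30.0,
    connection_count=120,
    security_zones=["dmz", "internal"],
    scalability_target_tb=50.0,
    availability_target=0.999,
    throughput_target_mbps=800.0,
    required_services=["backup", "replication"],
)
print(plan.meets_availability(0.9995))   # True
```

Keeping the factors in one explicit structure makes it easy to review them together during planning rather than letting any one of them (often security or growth) be considered in isolation.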
Offering access to large volumes of data, SANs provide high-speed network storage that is external to processing servers. Separating storage from the server greatly improves performance, since data storage and access tasks are offloaded from it. Generally, SANs connect with Fibre Channel, allowing them to share access to multiple storage devices such as tape or RAID (redundant array of independent/inexpensive disks) systems. (Weizner 2005) The concept of the SAN breaks away from the traditional network environment. In the past, storage devices were attached to the network via an available server. This limited the data that could be housed on a particular storage device. Since the server would run a designated operating system, the storage device attached to that server could only store information from that operating system. A SAN environment changes that concept entirely; multiple servers with different operating systems can share the same storage devices.
Going beyond storage area networks, there is network attached storage. Actually, the two complement each other to form a complete enterprise-wide storage system. NAS provides distributed storage for workgroups and departments, helping companies avoid having dozens of servers to administer and maintain. Still, NAS does not eliminate all administration costs. Network administrators still have to set up work groups and assign permissions and passwords for each NAS device. There are also times when data needs to be made available to an entire enterprise. When more workers access a NAS device than reside on its subnet, it is time to build a storage area network with multiple clustered servers and storage devices.
SANs differ from NAS in that they are composed of storage devices and servers connected via high-speed network connections, usually Fibre Channel or possibly gigabit Ethernet. (Berninger 1999) SANs can be server hosted, where the software needed to operate the storage devices resides on one or more of the servers, or peer-to-peer, where each storage device on the network manages itself with its own thin server. Clustered servers route data to users on a WAN (wide area network). SANs are scalable and extensible, making them suitable for companies that need large, centralized storage repositories for active archives or backups. In short, NAS is best for distributed workgroup storage, while SANs are best for centralized enterprise-wide use. Both complementary technologies have a place in the typical company within an SBS operating mode. (Barr 2006)
Additionally, the power, reliability, and extensibility of SANs are of great importance to implement E-commerce in every industry segment. However, the level of complexity typical of E-commerce and SAN systems results in major challenges to management. Even more important, managers are discovering that being able to see and manage their entire solution’s performance is a critical success factor when it comes to realizing the benefits that a reliable, redundant SAN can bring them.
Today, a number of vendors help companies manage their networks more easily and less expensively. The focus is giving companies a better handle on whether they are making optimal use of their network and application resources. For example, the Flame Thrower product family from Antara.net is software designed to optimize network capacity and test how network devices will perform under high volumes of traffic. Other software includes the following: Enterprise Perspective from Keynote manages Web application performance on either side of the firewall and displays results on existing management consoles; nGenius Capacity Planner 3.0 from NetScout is useful for long-term network planning, managing application service levels from the end-user perspective, and customizing performance reports; FastNet from NetTest is network hardware with embedded software that increases the speed and quantity of traffic on a network. Lastly, the RiskWise product suite from Pare Technologies manages E-business networks and provides traffic analysis, resilience analysis, and topology optimization. (Weizner 2005)
In the area of wireless and mobile technology, many users experience long delays. For example, field representatives for a water utility spent part of their days waiting for the sluggish Cellular Digital Packet Data (CDPD) modems in their laptops and PDAs to transmit and retrieve information. When the system managers figured out that it was taking up to four minutes to transmit a single completed service ticket, they knew they needed a solution that would move their wireless data much faster. A number of vendors, including startups BlueKite, BroadCloud, Fourelle, Idetic, and XOSoft, as well as the more established provider Inktomi, have responded by developing data optimization software that speeds up data transmission over existing cellular wireless networks. (Barr 2006) The software is expensive, but for some businesses, the time saved is worth the investment. Wireless carriers like Sprint PCS, Verizon Wireless, and Britain’s BT Cellnet are investing in the optimization packages and passing on the performance gains to their customers. (Weizner 2005)
Most difficulties with sending data over a cellular wireless network can be traced back to a single source: wired technology pressed into service in a wireless world. Specially designed wireless applications and protocols like the Wireless Application Protocol (WAP) are built to deal with the slow speeds and unpredictability of radio-based data transmissions. But many companies are still trying to run applications and network protocols designed for the relatively high bandwidth and consistent connections available over fixed wires.
Data optimization systems try to smooth out these differences by placing a small software client on the wireless device and a server on the other side of the wireless connection. Linked together, the client and the server act as a conduit for data flowing to and from the wireless device, compressing the data, removing extraneous information, and maintaining the link between the wireless device and the network. At this time, data optimization software is the only real answer to data transmission speed and reliability problems. Businesses that need complex data delivered speedily now have a way to end the waiting.
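A minimal sketch of that conduit idea, assuming a simple text-header framing: the client-side shim strips headers the far end does not need and compresses the payload before the slow wireless hop, and the server-side peer reverses the transform. The header names and framing here are illustrative assumptions, not any vendor’s actual protocol.

```python
import zlib

# Headers assumed (for illustration) to be extraneous on the wireless hop.
EXTRANEOUS_HEADERS = {"X-Debug-Trace", "X-Verbose-Log"}

def client_encode(headers, body):
    """Strip extraneous headers, then compress everything for the slow link."""
    kept = {k: v for k, v in headers.items() if k not in EXTRANEOUS_HEADERS}
    blob = "\n".join(f"{k}: {v}" for k in sorted(kept) for v in [kept[k]]) + "\n\n"
    return zlib.compress(blob.encode() + body)

def server_decode(packet):
    """Reverse the client transform on the wired side of the connection."""
    raw = zlib.decompress(packet)
    head, _, body = raw.partition(b"\n\n")
    headers = dict(line.split(": ", 1) for line in head.decode().splitlines())
    return headers, body

# A repetitive field-service ticket, the kind of payload that compresses well.
ticket = b"service ticket: meter 4411 read OK " * 50
packet = client_encode({"Content-Type": "text/plain",
                        "X-Debug-Trace": "verbose"}, ticket)
headers, body = server_decode(packet)
print(len(packet) < len(ticket))   # True: far fewer bytes cross the wireless hop
```

Because both ends of the conduit cooperate, the application on either side never sees the compressed form; it simply experiences a faster link, which is why such shims could be deployed without rewriting existing applications.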