
About ICM Solutions

ICM was formed in 1989 by professionals with a combined experience of more than 50 years in the fields of Information Technology and Management. Since then, the software arm of the company has served several clients spread across the US, India, Australia, Singapore, and Sweden. To date, the company has developed hundreds of solutions spanning various technologies, sizes, and domains. In recent years, it has gained notable expertise in insurance, operational safety, and building maintenance systems…

Mobile Applications


A mobile app is a computer program designed to run on mobile devices such as smartphones and tablet computers. Most such devices are sold with several apps included as pre-installed software, such as a web browser, email client, calendar, mapping program, and an app for buying music, other media, or more apps. Some pre-installed apps can be removed by an ordinary uninstall process, leaving more storage space for desired ones. Where the software does not allow this, some devices can be rooted to eliminate the undesired apps.

Apps that are not preinstalled are usually available through application distribution platforms, which began appearing in 2008 and are typically operated by the owner of the mobile operating system, such as the Apple App Store, Google Play, Windows Phone Store, and BlackBerry App World. Some apps are free, while others must be bought. Usually, they are downloaded from the platform to a target device, but sometimes they can be downloaded to laptops or desktop computers. For paid apps, a percentage (generally 20–30%) goes to the distribution provider (such as iTunes), and the rest goes to the producer of the app.[1] The same app can therefore cost the average smartphone user a different price depending on whether they use iPhone, Android, or BlackBerry 10 devices.
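
To make the revenue split concrete, here is a minimal sketch of the arithmetic, assuming an illustrative $4.99 list price and a 30% store commission (both figures are examples, not platform-specific facts):

```typescript
// Hypothetical revenue split for a paid app. The 30% commission rate and
// the $4.99 price are illustrative assumptions, not figures for any store.
function developerShare(listPrice: number, commissionRate = 0.3): number {
  const storeCut = listPrice * commissionRate;        // e.g. 4.99 * 0.30 ≈ 1.50 to the store
  return Math.round((listPrice - storeCut) * 100) / 100;
}

console.log(developerShare(4.99)); // ≈ 3.49 goes to the app's producer
```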

Development

Developing apps for mobile devices requires considering the constraints and features of these devices. Mobile devices run on battery, have less powerful processors than personal computers, and also offer features such as location detection and cameras.

Developers also have to consider a wide array of screen sizes, hardware specifications and configurations because of intense competition in mobile software and changes within each of the platforms.

Mobile application development requires use of specialized integrated development environments. Mobile apps are first tested within the development environment using emulators and later subjected to field testing. Emulators provide an inexpensive way to test applications on mobile phones to which developers may not have physical access.

As part of the development process, mobile user interface (UI) design is also essential in the creation of mobile apps. Mobile UI considers constraints and contexts, screen, input, and mobility as outlines for design. The user is often the focus of interaction with their device, and the interface entails components of both hardware and software. User input allows the users to manipulate a system, and the device’s output allows the system to indicate the effects of the users’ manipulation. Mobile UI design constraints include limited attention and form factors, such as a mobile device’s screen size for a user’s hand. Mobile UI contexts signal cues from user activity, such as location and scheduling, that can be shown from user interactions within a mobile application. Overall, the primary goal of mobile UI design is an understandable, user-friendly interface.

Mobile UIs, or front-ends, rely on mobile back-ends to support access to enterprise systems. The mobile back-end facilitates data routing, security, authentication, authorization, working off-line, and service orchestration. This functionality is supported by a mix of middleware components including mobile app servers, Mobile Backend as a service (MBaaS), and SOA infrastructure.
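
A minimal sketch of the front-end/back-end relationship described above: a mobile client calling an authenticated REST endpoint exposed by the mobile back-end. The URL, bearer-token authentication, and response shape are assumptions made for illustration, not the contract of any particular MBaaS:

```typescript
// Hypothetical mobile front-end call to an enterprise back-end.
// The /api/orders endpoint, the Order shape, and bearer-token auth are
// invented for this sketch; a real back-end defines its own contract.
interface Order {
  id: string;
  status: string;
}

async function fetchOrders(baseUrl: string, token: string): Promise<Order[]> {
  const res = await fetch(`${baseUrl}/api/orders`, {
    headers: { Authorization: `Bearer ${token}` }, // authentication/authorization handled by the back-end
  });
  if (!res.ok) throw new Error(`Back-end returned ${res.status}`);
  return (await res.json()) as Order[]; // data routed from enterprise systems
}
```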

Google Play

Google Play (formerly known as the Android Market) is an international online software store developed by Google for Android devices. It opened in October 2008. In August 2014, there were more than 1.3 million apps available for Android, and the estimated number of applications downloaded from Google Play was 40 billion.

App Store

Apple’s App Store for iOS was not the first app distribution service, but it ignited the mobile revolution. It opened on July 10, 2008, and as of January 2011 reported over 10 billion downloads. (The original AppStore was first demonstrated to Steve Jobs in 1993 by Jesse Tayler at NeXTWorld Expo.) As of June 6, 2011, there were 425,000 apps available, which had been downloaded by 200 million iOS users. During Apple’s 2012 Worldwide Developers Conference, Apple CEO Tim Cook announced that the App Store had 650,000 available apps and that 30 billion apps had been downloaded from the store up to that date. From an alternative perspective, figures seen in July 2013 by the BBC from tracking service Adeven indicate that over two-thirds of apps in the store are “zombies”, barely ever installed by consumers.

Data Analysis & Migration


Data migration is the process of transferring data between storage types, formats, or computer systems. It is a key consideration for any system implementation, upgrade, or consolidation. Data migration is usually performed programmatically to achieve an automated migration, freeing up human resources from tedious tasks. Data migration occurs for a variety of reasons, including server or storage equipment replacements, maintenance or upgrades, application migration, website consolidation and data center relocation.

To achieve an effective data migration procedure, data on the old system is mapped to the new system utilizing a design for data extraction and data loading. The design relates old data formats to the new system’s formats and requirements. Programmatic data migration may involve many phases, but it minimally includes data extraction, where data is read from the old system, and data loading, where data is written to the new system.
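
A minimal sketch of that extract/transform/load flow, assuming hypothetical legacy and target record shapes; a real migration would read from and write to the actual systems rather than in-memory arrays:

```typescript
// Illustrative extract/transform/load pass. The record shapes and the field
// mapping are assumptions; real code would read from the legacy system and
// write to the target system instead of using in-memory arrays.
interface LegacyCustomer { CUST_NO: string; NAME: string; DOB: string }      // old format
interface NewCustomer    { id: string; fullName: string; birthDate: Date }   // new format

function extract(legacyRows: LegacyCustomer[]): LegacyCustomer[] {
  return legacyRows; // stand-in for reading from the old system
}

function transform(row: LegacyCustomer): NewCustomer {
  // The mapping design relates old fields and formats to the new system's requirements.
  return { id: row.CUST_NO, fullName: row.NAME.trim(), birthDate: new Date(row.DOB) };
}

function load(rows: NewCustomer[], target: NewCustomer[]): void {
  target.push(...rows); // stand-in for writing to the new system
}

const legacy: LegacyCustomer[] = [{ CUST_NO: "C001", NAME: " Jane Doe ", DOB: "1980-05-01" }];
const newSystem: NewCustomer[] = [];
load(extract(legacy).map(transform), newSystem);
```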

After loading into the new system, results are subjected to data verification to determine whether data was accurately translated, is complete, and supports processes in the new system. During verification, there may be a need for a parallel run of both systems to identify areas of disparity and forestall erroneous data loss.
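
Verification can start with simple checks such as record counts and per-key field comparisons between the source data and the newly loaded data. The sketch below is an illustrative sample of such checks, not a complete reconciliation, and the record shape is an assumption:

```typescript
// Illustrative post-load verification. The MigratedRecord shape and the choice
// of checks (count plus per-key comparison) are assumptions for the example.
interface MigratedRecord { id: string; fullName: string }

function verifyMigration(source: MigratedRecord[], target: MigratedRecord[]): string[] {
  const issues: string[] = [];
  if (source.length !== target.length) {
    issues.push(`Record count mismatch: ${source.length} vs ${target.length}`);
  }
  const loadedById = new Map(target.map((r) => [r.id, r]));
  for (const row of source) {
    const loaded = loadedById.get(row.id);
    if (!loaded) issues.push(`Missing record ${row.id}`);
    else if (loaded.fullName !== row.fullName) issues.push(`Field mismatch for ${row.id}`);
  }
  return issues; // an empty list means the sampled checks passed
}
```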

Automated and manual data cleaning is commonly performed in migration to improve data quality, eliminate redundant or obsolete information, and match the requirements of the new system.

Data migration phases (design, extraction, cleansing, load, verification) for applications of moderate to high complexity are commonly repeated several times before the new system is deployed.

Categories

Data is stored on various media in files or databases, and is generated and consumed by software applications which in turn support business processes. The need to transfer and convert data can be driven by multiple business requirements and the approach taken to the migration depends on those requirements. Four major migration categories are proposed on this basis.

Storage migration

A business may choose to rationalize the physical media to take advantage of more efficient storage technologies. This will result in having to move physical blocks of data from one tape or disk to another, often using virtualization techniques. The data format and content itself will not usually be changed in the process and can normally be achieved with minimal or no impact to the layers above.

Database migration

Similarly, it may be necessary to move from one database vendor to another, or to upgrade the version of database software being used. The latter case is less likely to require a physical data migration, but this can happen with major upgrades. In these cases a physical transformation process may be required since the underlying data format can change significantly. This may or may not affect behavior in the applications layer, depending largely on whether the data manipulation language or protocol has changed – but modern applications are written to be agnostic to the database technology so that a change from Sybase, MySQL, DB2 or SQL Server to Oracle should only require a testing cycle to be confident that both functional and non-functional performance has not been adversely affected.

Application migration

Changing application vendor – for instance a new CRM or ERP platform – will inevitably involve substantial transformation as almost every application or suite operates on its own specific data model and also interacts with other applications and systems within the enterprise application integration environment. Furthermore, to allow the application to be sold to the widest possible market, commercial off-the-shelf packages are generally configured for each customer using metadata. Application programming interfaces (APIs) may be supplied by vendors to protect the integrity of the data they have to handle.

Business process migration

Business processes operate through a combination of human and application systems actions, often orchestrated by business process management tools. When these change they can require the movement of data from one store, database or application to another to reflect the changes to the organization and information about customers, products and operations. Examples of such migration drivers are mergers and acquisitions, business optimization and reorganization to attack new markets or respond to competitive threat.

The first two categories of migration are usually routine operational activities that the IT department takes care of without the involvement of the rest of the business. The last two categories directly affect the operational users of processes and applications, are necessarily complex, and delivering them without significant business downtime can be challenging. A highly adaptive approach, concurrent synchronization, a business-oriented audit capability and clear visibility of the migration for stakeholders are likely to be key requirements in such migrations.

Project versus process

There is a difference between data migration and data integration activities. Data migration is a project by means of which data will be moved or copied from one environment to another, and removed or decommissioned in the source. During the migration (which can take place over months or even years), data can flow in multiple directions, and there may be multiple migrations taking place simultaneously. The Extract, Transform, Load actions will be necessary, although the means of achieving these may not be those traditionally associated with the ETL acronym.

Data integration, by contrast, is a permanent part of the IT architecture, and is responsible for the way data flows between the various applications and data stores – and is a process rather than a project activity. Standard ETL technologies designed to supply data from operational systems to data warehouses would fit within the latter category.

E-Commerce


E-commerce (also written as e-Commerce, eCommerce or similar variants), short for electronic commerce, is trading in products or services using computer networks, such as the Internet. Electronic commerce draws on technologies such as mobile commerce, electronic funds transfer, supply chain management, Internet marketing, online transaction processing, electronic data interchange (EDI), inventory management systems, and automated data collection systems. Modern electronic commerce typically uses the World Wide Web for at least one part of the transaction’s life cycle, although it may also use other technologies such as e-mail.

E-commerce businesses may employ some or all of the following:

  • Online shopping web sites for retail sales direct to consumers
  • Providing or participating in online marketplaces, which process third-party business-to-consumer or consumer-to-consumer sales
  • Business-to-business buying and selling
  • Gathering and using demographic data through web contacts and social media
  • Business-to-business electronic data interchange
  • Marketing to prospective and established customers by e-mail or fax (for example, with newsletters)
  • Engaging in pretail for launching new products and services

Forms

Contemporary electronic commerce involves everything from ordering “digital” content for immediate online consumption, to ordering conventional goods and services, to “meta” services to facilitate other types of electronic commerce.

On the institutional level, big corporations and financial institutions use the internet to exchange financial data to facilitate domestic and international business. Data integrity and security are pressing issues for electronic commerce.

Aside from traditional e-Commerce, the terms m-Commerce (mobile commerce) and, more recently (around 2013), t-Commerce have also been used.

Global trends

In 2010, the United Kingdom had the biggest e-commerce market in the world when measured by the amount spent per capita. The Czech Republic is the European country where ecommerce delivers the biggest contribution to the enterprises’ total revenue. Almost a quarter (24%) of the country’s total turnover is generated via the online channel.

Among emerging economies, China’s e-commerce presence continues to expand every year. With 384 million internet users, China’s online shopping sales rose to $36.6 billion in 2009, and one of the reasons behind the huge growth has been the improved trust level for shoppers. The Chinese retailers have been able to help consumers feel more comfortable shopping online. China’s cross-border e-commerce is also growing rapidly. E-commerce transactions between China and other countries increased 32% to 2.3 trillion yuan ($375.8 billion) in 2012 and accounted for 9.6% of China’s total international trade. In 2013, Alibaba had an e-commerce market share of 80% in China.

Other BRIC countries are witnessing the accelerated growth of eCommerce as well. Brazil’s eCommerce is growing quickly, with retail eCommerce sales expected to grow at a healthy double-digit pace through 2014. By 2016, eMarketer expects retail ecommerce sales in Brazil to reach $17.3 billion. India has an internet user base of about 243.2 million as of January 2014. Despite having the third largest user base in the world, India’s Internet penetration is low compared to markets like the United States, the United Kingdom, or France, but it is growing at a much faster rate, adding around 6 million new users every month. The industry consensus is that growth is at an inflection point. In India, cash on delivery is the most preferred payment method, accounting for 75% of e-retail activity.

Mobile devices are playing an increasing role in the mix of eCommerce. Some estimates show that purchases made on mobile devices will make up 25% of the market by 2017. According to the Cisco Visual Networking Index, the number of mobile devices was expected to exceed the world’s population in 2014.

Cloud Computing


Cloud computing is a model for enabling ubiquitous, convenient, on-demand access to a shared pool of configurable computing resources. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. It relies on sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network. At the foundation of cloud computing is the broader concept of converged infrastructure and shared services.

Cloud computing, or in simpler shorthand just “the cloud”, also focuses on maximizing the effectiveness of the shared resources. Cloud resources are usually not only shared by multiple users but are also dynamically reallocated per demand. This can work for allocating resources to users. For example, a cloud computer facility that serves European users during European business hours with a specific application (e.g., email) may reallocate the same resources to serve North American users during North America’s business hours with a different application (e.g., a web server). This approach helps maximize the use of computing power while reducing the overall cost of resources by using less power, air conditioning, rack space, etc. to maintain the system. With cloud computing, multiple users can access a single server to retrieve and update their data without purchasing licenses for different applications.

The term “moving to cloud” also refers to an organization moving away from a traditional CAPEX model (buy the dedicated hardware and depreciate it over a period of time) to the OPEX model (use a shared cloud infrastructure and pay as one uses it).

Proponents claim that cloud computing allows companies to avoid upfront infrastructure costs, and focus on projects that differentiate their businesses instead of on infrastructure. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to more rapidly adjust resources to meet fluctuating and unpredictable business demand. Cloud providers typically use a “pay as you go” model. This can lead to unexpectedly high charges if administrators do not adapt to the cloud pricing model.

The present availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, and autonomic and utility computing, have led to a growth in cloud computing. Companies can scale up as computing needs increase and then scale down again as demands decrease.

System Integration


In engineering, system integration is defined as the process of bringing together the component subsystems into one system and ensuring that the subsystems function together as a system. In information technology, systems integration is the process of linking together different computing systems and software applications physically or functionally, to act as a coordinated whole.

The system integrator brings together discrete systems utilizing a variety of techniques such as computer networking, enterprise application integration, business process management or manual programming.

Overview

A system is an aggregation of subsystems cooperating so that the system is able to deliver the overarching functionality. System integration involves integrating existing, often disparate, systems.

System integration (SI) is also about adding value to the system: capabilities that are possible because of interactions between subsystems.

In today’s connected world, the role of system integration engineers is important: more and more systems are designed to connect, both within the system under construction and to systems that are already deployed.

Required skills

A system integration engineer needs a broad range of skills and is likely to be defined by a breadth of knowledge rather than a depth of knowledge. These skills are likely to include software, systems and enterprise architecture, software and hardware engineering, interface protocols, and general problem solving skills. It is likely that the problems to be solved have not been solved before except in the broadest sense. They are likely to include new and challenging problems with input from a broad range of engineers, where the system integration engineer “pulls it all together”.

Methods of integration

Vertical integration (as opposed to “horizontal”) is the process of integrating subsystems according to their functionality by creating functional entities also referred to as silos.[7] The benefit of this method is that the integration is performed quickly and involves only the necessary vendors, therefore, this method is cheaper in the short term. On the other hand, cost-of-ownership can be substantially higher than seen in other methods, since in case of new or enhanced functionality, the only possible way to implement (scale the system) would be by implementing another silo. Reusing subsystems to create another functionality is not possible.

Star integration, also known as spaghetti integration, is a process of systems integration where each system is interconnected to each of the remaining subsystems. When observed from the perspective of the subsystem which is being integrated, the connections are reminiscent of a star, but when the overall diagram of the system is presented, the connections look like spaghetti, hence the name of this method. The cost varies depending on the interfaces that the subsystems export. In a case where the subsystems export heterogeneous or proprietary interfaces, the integration cost can rise substantially. The time and cost needed to integrate the systems grow quickly as subsystems are added, since the number of point-to-point interfaces grows quadratically with the number of subsystems. From the feature perspective, this method often seems preferable, due to the extreme flexibility of the reuse of functionality.

Horizontal integration, or Enterprise Service Bus (ESB), is an integration method in which a specialized subsystem is dedicated to communication between the other subsystems. This allows cutting the number of connections (interfaces) to only one per subsystem, which connects directly to the ESB. The ESB is capable of translating one interface into another. This cuts the cost of integration and provides extreme flexibility. With systems integrated using this method, it is possible to completely replace one subsystem with another subsystem that provides similar functionality but exports different interfaces, all of this completely transparently to the rest of the subsystems. The only action required is to implement the new interface between the ESB and the new subsystem.
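
A small sketch comparing the interface counts of the two methods just described; the figures are simple arithmetic, not measured integration costs:

```typescript
// Interface counts: point-to-point (star) integration connects every pair of
// subsystems, while an ESB needs only one connection per subsystem.
function starInterfaces(subsystems: number): number {
  return (subsystems * (subsystems - 1)) / 2; // every pair connected directly
}

function esbInterfaces(subsystems: number): number {
  return subsystems; // one connection per subsystem, all through the bus
}

console.log(starInterfaces(10)); // 45 point-to-point interfaces
console.log(esbInterfaces(10));  // 10 interfaces to the ESB
```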

The horizontal scheme can be misleading, however, if it is thought that the cost of intermediate data transformation or the cost of shifting responsibility over business logic can be avoided.

A common data format is an integration method that avoids every adapter having to convert data to and from every other application’s format. Enterprise application integration (EAI) systems usually stipulate an application-independent (or common) data format. The EAI system usually provides a data transformation service as well, to help convert between application-specific and common formats. This is done in two steps: the adapter converts information from the application’s format to the bus’s common format; then, semantic transformations are applied to this (converting zip codes to city names, splitting/merging objects from one application into objects in the other applications, and so on).
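
A minimal sketch of that two-step conversion, using invented application-specific and common formats; the zip-to-city lookup table is an assumption for illustration:

```typescript
// Step 1: an adapter converts an application-specific record into the bus's
// common format. Step 2: a semantic transformation is applied (here, resolving
// a zip code to a city name). All formats and the lookup table are invented.
interface CrmContact  { contact_name: string; zip: string }            // application-specific format
interface CommonParty { name: string; zipCode: string; city?: string } // common (application-independent) format

const zipToCity: Record<string, string> = { "77002": "Houston" };      // hypothetical reference data

function adapt(record: CrmContact): CommonParty {
  return { name: record.contact_name, zipCode: record.zip };
}

function semanticTransform(party: CommonParty): CommonParty {
  return { ...party, city: zipToCity[party.zipCode] };
}

console.log(semanticTransform(adapt({ contact_name: "Jane Doe", zip: "77002" })));
// { name: "Jane Doe", zipCode: "77002", city: "Houston" }
```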

Application Development


Web Application Development

Web applications are popular due to the ubiquity of web browsers and the convenience of using a web browser as a client. The ability to update and maintain web applications without distributing and installing software on potentially thousands of client computers is a key reason for their popularity, as is the inherent support for cross-platform compatibility. Common web applications include webmail, online retail sales, online auctions, wikis, and many other functions.

Interface

Through Java, JavaScript, DHTML, Flash, Silverlight and other technologies, application-specific methods such as drawing on the screen, playing audio, and access to the keyboard and mouse are all possible. Many services have worked to combine all of these into a more familiar interface that adopts the appearance of an operating system. General purpose techniques such as drag and drop are also supported by these technologies. Web developers often use client-side scripting to add functionality, especially to create an interactive experience that does not require page reloading. Recently, technologies have been developed to coordinate client-side scripting with server-side technologies such as PHP. Ajax, a web development technique using a combination of various technologies, is an example of technology which creates a more interactive experience.
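
The paragraph above describes the Ajax pattern of updating part of a page without reloading it; a small client-side sketch follows, where the endpoint and element id are illustrative assumptions:

```typescript
// Browser-side sketch: fetch JSON from a (hypothetical) server endpoint and
// update a single element in place, avoiding a full page reload.
async function refreshNews(): Promise<void> {
  const res = await fetch("/api/latest-news");        // assumed server-side endpoint (e.g. a PHP script)
  const items: { title: string }[] = await res.json();
  const list = document.getElementById("news");       // assumed element in the page
  if (!list) return;
  list.innerHTML = items.map((item) => `<li>${item.title}</li>`).join("");
}

refreshNews();
```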

Structure

Applications are usually broken into logical chunks called “tiers”, where every tier is assigned a role. Traditional applications consist only of one tier, which resides on the client machine, but web applications lend themselves to an n-tiered approach by nature. Though many variations are possible, the most common structure is the three-tiered application. In its most common form, the three tiers are called presentation, application, and storage, in this order. A web browser is the first tier (presentation), an engine using some dynamic Web content technology (such as ASP, CGI, ColdFusion, Dart, JSP/Java, Node.js, PHP, Python or Ruby on Rails) is the middle tier (application logic), and a database is the third tier (storage). The web browser sends requests to the middle tier, which services them by making queries and updates against the database and generating a user interface.
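
A minimal middle-tier sketch using Node.js, one of the technologies listed above. The in-memory map stands in for the storage tier, and the /clients route is an assumption made for the example:

```typescript
// Minimal middle tier (application logic): the browser (presentation tier)
// requests /clients, the handler "queries" an in-memory store standing in for
// the database (storage tier), and a small HTML fragment is returned.
import { createServer } from "node:http";

const clientTable = new Map<string, string>([["C001", "Acme Ltd"]]); // stand-in for the database tier

createServer((req, res) => {
  if (req.url === "/clients") {
    const rows = [...clientTable.values()].map((name) => `<li>${name}</li>`).join("");
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(`<ul>${rows}</ul>`); // generated user interface returned to the browser
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);
```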

For more complex applications, a 3-tier solution may fall short, and it may be beneficial to use an n-tiered approach, where the greatest benefit is breaking the business logic, which resides on the application tier, into a more fine-grained model. Another benefit may be adding an integration tier that separates the data tier from the rest of tiers by providing an easy-to-use interface to access the data. For example, the client data would be accessed by calling a “list_clients()” function instead of making an SQL query directly against the client table on the database. This allows the underlying database to be replaced without making any change to the other tiers.
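
Sketching the “list_clients()” idea from the paragraph above as an integration-tier interface: callers depend only on the interface, never on SQL, so the underlying database can be swapped without touching the other tiers. The names, the Client shape, and the query text are illustrative assumptions:

```typescript
// Integration tier: the rest of the application calls listClients() and never
// sees SQL, so the underlying database can be replaced independently.
// The Client shape, the SQL text, and the injected runQuery helper are assumptions.
interface Client { id: string; name: string }

interface ClientRepository {
  listClients(): Promise<Client[]>;
}

class SqlClientRepository implements ClientRepository {
  constructor(private runQuery: (sql: string) => Promise<Client[]>) {}

  listClients(): Promise<Client[]> {
    return this.runQuery("SELECT id, name FROM client"); // hidden behind the interface
  }
}
```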

There are some who view a web application as a two-tier architecture. This can be a “smart” client that performs all the work and queries a “dumb” server, or a “dumb” client that relies on a “smart” server. The client would handle the presentation tier, the server would have the database (storage tier), and the business logic (application tier) would be on one of them or on both. While this increases the scalability of the applications and separates the display and the database, it still doesn’t allow for true specialization of layers, so most applications will outgrow this model.

Software Development

Software development is the computer programming, documenting, testing, and bug fixing involved in creating and maintaining applications and frameworks involved in a software release life cycle and resulting in a software product. The term refers to a process of writing and maintaining the source code, but in a broader sense it includes all that is involved from the conception of the desired software through to its final manifestation, ideally in a planned and structured process. Therefore, software development may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities that result in software products.

Software can be developed for a variety of purposes, the three most common being to meet specific needs of a specific client/business (the case with custom software), to meet a perceived need of some set of potential users (the case with commercial and open source software), or for personal use (e.g. a scientist may write software to automate a mundane task). Embedded software development, that is, the development of embedded software such as used for controlling consumer products, requires the development process to be integrated with the development of the controlled physical product. System software underlies applications and the programming process itself, and is often developed separately.

Software Development Activities

Identification of need

The sources of ideas for software products are legion. These ideas can come from market research, including the demographics of potential new customers, existing customers, sales prospects who rejected the product, other internal software development staff, or a creative third party. Ideas for software products are usually first evaluated by marketing personnel for economic feasibility, for fit with existing distribution channels, for possible effects on existing product lines, required features, and for fit with the company’s marketing objectives. In a marketing evaluation phase, the cost and time assumptions are evaluated. A decision is reached early in the first phase as to whether, based on the more detailed information generated by the marketing and development staff, the project should be pursued further.

Because software development may involve compromising or going beyond what is required by the client, a software development project may stray into less technical concerns such as human resources, risk management, intellectual property, budgeting, crisis management, etc. These processes may also cause the role of business development to overlap with software development.

Planning

Planning is an objective of each and every activity, in which we want to discover what belongs to the project. An important task in creating a software program is extracting the requirements, or requirements analysis. Customers typically have an abstract idea of what they want as an end result but not of what the software should do. Skilled and experienced software engineers recognize incomplete, ambiguous, or even contradictory requirements at this point. Frequently demonstrating live code may help reduce the risk that the requirements are incorrect.

Once the general requirements are gathered from the client, an analysis of the scope of the development should be determined and clearly stated. This is often called a scope document.

Certain functionality may be out of scope of the project as a function of cost or as a result of unclear requirements at the start of development. If the development is done externally, this document can be considered a legal document so that if there are ever disputes, any ambiguity of what was promised to the client can be clarified.

Designing

Once the requirements are established, the design of the software can be established in a software design document. This involves a preliminary, or high-level design of the main modules with an overall picture (such as a block diagram) of how the parts fit together. The language, operating system, and hardware components should all be known at this time. Then a detailed or low-level design is created, perhaps with prototyping as proof-of-concept or to firm up requirements.

Implementation, Testing and Documenting

Implementation is the part of the process where software engineers actually program the code for the project.

Software testing is an integral and important phase of the software development process. This part of the process ensures that defects are recognized as soon as possible. In some processes, generally known as test-driven development, tests may be developed just before implementation and serve as a guide for the implementation’s correctness.
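
A test-first sketch in miniature, assuming a trivial applyDiscount function as the unit under development; both the function and its rounding rule are invented for the example:

```typescript
// Test-driven development in miniature: the assertions below are written first
// and then guide the implementation until they pass. applyDiscount and its
// rounding rule are assumptions made for this sketch.
import { strict as assert } from "node:assert";

function applyDiscount(price: number, percent: number): number {
  return Math.round(price * (1 - percent / 100) * 100) / 100; // round to cents
}

// The tests (written before the implementation in a TDD flow).
assert.equal(applyDiscount(200, 10), 180);
assert.equal(applyDiscount(100, 25), 75);
console.log("all tests passed");
```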

Documenting the internal design of software for the purpose of future maintenance and enhancement is done throughout development. This may also include the writing of an API, be it external or internal. The software engineering process chosen by the developing team will determine how much internal documentation (if any) is necessary. Plan-driven models (e.g., Waterfall) generally produce more documentation than Agile models.

Deployment and maintenance

Deployment starts directly after the code is appropriately tested, approved for release, and sold or otherwise distributed into a production environment. This may involve installation, customization (such as by setting parameters to the customer’s values), testing, and possibly an extended period of evaluation.

Software training and support are important, as software is only effective if it is used correctly.

Maintaining and enhancing software to cope with newly discovered faults or requirements can take substantial time and effort, as missed requirements may force redesign of the software.

About Us


ICM Solutions History

In 1989, ICM Computer Consultants was formed by professionals who today have a combined experience of 50 years in IT projects and management in the US and in India, to offer both hardware and software solutions to the market.

Later on, in 1994, the promoters separated the software division into a new firm called ICM Exports to cater to software markets outside of India. Today, ICM Exports is known as ICM Solutions, and it has clients spread across the US, Australia, Singapore, and Sweden.

ICM has the technical competence and management commitment to ensure that its clients achieve their corporate goals. We understand the clients’ requirements, constraints, priorities, etc., and focus on coming up with robust solutions that meet their business needs in every engagement. Our technical competence, effective and open communication, and ethical business practices are the strengths that help us to compete effectively in the marketplace.

Our Management

Mr. V Narayanan, CEO

V. Narayanan is the founder of ICM Computer Consultants and heads all the various group companies of ICM. He received his MBA in the US. He is also a CPA (Texas) and a CMA. He worked in the US for over 13 years and held various management positions in Fortune 500 companies. Prior to forming ICM, he was with Lone Star Gas, USA, as Vice President/Treasurer. He was the youngest officer of the company and was instrumental in saving millions by effectively automating various company activities. He provides vision and direction for the company and also serves as its Chief Financial Officer.

Mrs. Kalyani Narayanan

Ms. N. Kalyani served as the CEO of Allfon Systems, a US-based multinational software start-up company, for over 5 years before assuming her position at ICM Solutions. Prior to Allfon, she was with another multinational IT solutions company, Covansys (which is now part of CSC). She obtained her degree from the University of Houston, Texas, and worked for various companies in the US, such as Sprint and Pennzoil, before relocating back to India. Ms. Kalyani has managed several software projects across various technologies, from mainframe to client-server to web-based solutions, both in the US and in India.

Message from CEO

Our business objective is that ICM Solutions should help our customers to effectively compete in their businesses by being their “Partner of Choice” for software solutions and related services.

This requires a deep sense of commitment and a high level of energy. With careful planning and close monitoring, we are able to help our customers get closer to achieving their goals. Our project managers do not fear change and constantly tackle every customer’s pain points with creative and effective solutions. By constantly motivating them and empowering them with the required support, we enable them to do things smarter, faster, and better, thus continually enhancing the value for money we offer our customers.

From the management perspective, I see our path to reach the goal lies in constantly managing three aspects in our engagements:

  • Find out where we are – by actively seeking client feedback and analyzing our own performance metrics
  • Figure out where we should be – by setting achievable objectives based on where we are and enhancing our processes to allow us to accomplish them
  • Be there – by effectively communicating this to the entire team and by carefully measuring and monitoring constantly to ensure that we get to where we want to be.

Of course, this is not a single-cycle process but an iterative one that constantly engages everyone in the company to relentlessly show improvements. It is my duty and responsibility as the CEO of ICM Solutions to ensure that the company’s objective is met, where everyone – the clients, the employees, and the community – comes out a winner.

So far, in our 20+ years of being in business, many clients have helped us to serve them better, and the relationship has grown from a mere “client – vendor” relationship to one of “partnership”. This indicates that we are on the right path, and we look forward to more such partnerships.

Thank you for your support.
V Narayanan