

Turning a New Page

Business Problem |

A national wireless provider wished to improve and expand an internal system used for network performance monitoring and reporting, a system that helps the company maintain a competitive edge over other wireless providers. The goal was to deliver hundreds of reports to the user community within an eight-month period. Because the users are technically proficient and their needs mission-critical, it was also important that the system be of high quality: reliable, high-performing, and full-featured. However, the budget for software development was capped, making project execution a challenge.

The provider approached Bright Byte to take charge of the system analysis and development effort. They chose Bright Byte because we have a proven track record of meeting aggressive delivery dates and an extensive network for finding good talent. We use an agile software development process that focuses on delivery, and we are unencumbered by the heavy processes associated with large companies.

Our Approach |

Bright Byte’s approach was tailored to address the three major challenges on this project.

• Cost: To stretch the budget dollars, we put together a project team spanning two continents. A team of five Analysts and a Project Manager was based in the US, co-located with the customer’s user base. The Analysts had technical backgrounds (three had been programmers earlier in their careers), excellent requirements gathering, design, and testing skills, and were equally comfortable communicating with business users as with software developers. A team of six Software Developers, two Testers, and a Project Manager was based in Romania. These were senior people with a successful track record of providing software development services to overseas clients.

• Scope: Our estimation and modeling demonstrated to the customer that it was highly unlikely that 100% of the report requirements could be fulfilled within the eight-month period with the team in place. After a discussion with management, the scope was reduced to about 65% of the original request.
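A back-of-the-envelope capacity model of the kind described above can be sketched as follows. All figures here are hypothetical, chosen only to illustrate the arithmetic; the actual estimate was based on the project's own data.

```python
def feasible_scope(reports_requested, months, reports_per_dev_per_month, developers):
    """Estimate the fraction of requested reports a team can deliver.

    Hypothetical, simplified model: capacity is assumed to scale linearly
    with team size and calendar time, which real projects only approximate.
    """
    capacity = months * reports_per_dev_per_month * developers
    return min(1.0, capacity / reports_requested)

# Illustrative numbers only: 300 reports requested, 8 months,
# 6 developers each averaging 4 reports per month.
share = feasible_scope(300, 8, 4, 6)
print(f"{share:.0%} of requested reports deliverable")  # → 64%
```

A model this simple is useful mainly for the conversation it forces: once the customer sees that capacity falls well short of 100%, the discussion shifts from "can you do it all?" to "which reports matter most?".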

• Process: The users depend on the reports to do their jobs, so we needed to deliver quickly, and often. We prioritized the report requirements, split the list into one-month phases, published the phased list to the user community, and organized the process and the team to deliver on a monthly cycle. We put in place an agile process built around three types of documentation, heavy phone and email communication, daily conference calls, daily project plan updates from each team member, and weekly status updates to management, with software releases to production at the end of each month. Analysis and Design were done in the US; Development and Unit Testing in Romania; System Testing and Deployment back in the US. This process allowed us to deliver quickly, and with higher quality than would otherwise have been possible.

The customer required Bright Byte to use the Solaris operating system, the Oracle database platform, and ColdFusion as the rapid application development tool, supported by certain Java modules. The US software environment was largely replicated in Romania to reduce issues when completed code from Romania arrived in the US for testing.

The Result |

We successfully delivered eight phases of the project within the eight-month period, completing 95% of the reduced scope; the reports not delivered were those the customer considered unnecessary or ill-defined. The users appreciated the number of reports made available to them, as well as our attentiveness to their needs. The managers appreciated that we completed this highly visible project on time and on budget.

On the strength of our performance over the first eight months, Bright Byte won a major extension on the project. During this period, we built up a team of Software Developers in the US; the Romanian resources transitioned their code and knowledge to this team and then moved on to a different assignment.

We reviewed all aspects of the project to identify areas for improvement. The pace of development in the first eight months had been so frenetic that code reviews had received scant attention, so we spent time refactoring the code to improve quality and to modularize and standardize various functionalities. We rebalanced the team to improve overall throughput, adjusting the mix of Analysts, Developers, and Testers and bringing on some Oracle Database Developers. We introduced instant messaging to improve communication and web-based collaboration tools to keep team members on the same page. We kept the agile process but moved to a six-week release cycle to allow more time for Design. These changes led to a marked improvement in the team's overall delivery capability.
