
Research experience of Sugata Sanyal 

Recent Work (2007 onwards)

  1. PRIDES: Power Resilient Intrusion Detection System For Detecting Sleep Deprivation in Geo-Sensor Network (with Tapolina Bhattasali, Rituparna Chaki)
    Rapid growth of ubiquitous wireless communication technology enables critical sensor applications such as environmental hazard warnings, because wireless sensor networks are easy to deploy, anytime and anywhere, without any fixed infrastructure, whereas other types of networks may not be configurable in remote or dangerous environments. Heterogeneous wireless geo-sensor networks are more suitable for real-life applications than their homogeneous counterparts. A Wireless Geo-Sensor Network (WGSNET) is less secure because sensor nodes are deployed in hostile environments where major processes may not be analyzed accurately due to problems of accessibility. Denial of service in a geo-sensor network can therefore have serious consequences. A particularly devastating attack in this category is the sleep deprivation attack, in which the intruder's goal is to maximize the power consumption of sensor nodes so that they stop working. An intrusion detection system is one of the major and most efficient defensive methods against attacks on sensor networks. Because of the various constraints of sensor networks, security solutions have to be designed with limited use of computation and resources. In this paper, we present a brief review of the state-of-the-art security scenario for WGSNETs. Finally, a cluster-based five-tiered framework, the PRIDES model, is proposed for heterogeneous wireless geo-sensor networks to efficiently mitigate the sleep deprivation attack. Simulation results exhibit the effectiveness of the proposed PRIDES model in a WGSNET.
  2. A New Trusted and Collaborative Agent Based Approach for Ensuring Cloud Security (with Shantanu Pal, Sunirmal Khatua, Nabendu Chaki)
    Determining a user's trust is a growing concern for ensuring privacy and security in a cloud computing environment. In the cloud, a user's data is stored on one or more remote servers, which poses additional security challenges for the system. One of the most important concerns is to protect a user's sensitive information from other users and from hackers who may cause data leakage in cloud storage. With this security challenge in mind, this paper focuses on the development of a more secure cloud environment in which the trust of service-requesting authorities is determined using a novel VM (Virtual Machine) monitoring system. Moreover, this research proposes a new trusted and collaborative agent-based two-tier framework, titled WAY (Who Are You?), to protect cloud resources. The framework can be used to provide security for the network, the infrastructure, and data storage in a heterogeneous cloud platform. If the trust-updating policy is based on network activities, the framework provides network security. Similarly, it provides storage security by monitoring unauthorized access by Cloud Service Users (CSUs). Infrastructure security is provided by monitoring the use of privileged instructions within the isolated VMs. The uniqueness of the proposed security solution lies in the fact that it ensures security and privacy both at the service-provider level and at the user level in a cloud environment.
  3. QoS Routing using OLSR with Optimization for Flooding (with Suman Banik, Bibhash Roy, Parthi Dey, Nabendu Chaki)
    A Mobile Ad-hoc Network (MANET) is a self-organizing collection of mobile nodes communicating over a wireless medium. Ad hoc wireless networks have massive commercial and military potential because of their mobility support. Due to demanding real-time multimedia applications, Quality of Service (QoS) support in such infrastructure-less networks has become essential. QoS routing in mobile ad hoc networks is challenging due to the rapid change in network topology; consequently, the available state information for routing is inherently imprecise. QoS routing may also suffer badly due to several factors, including radio interference on the available bandwidth and inefficient flooding of information to adjacent nodes, and as a result the performance of the network degrades substantially. This paper aims at a solution for energy-efficient QoS routing through the best utilization of network resources such as energy and bandwidth. A comparative study shows that, despite the overhead due to QoS management, this solution performs better than the classical Optimized Link State Routing (OLSR) protocol in terms of QoS and efficient utilization of energy.
  4. Modeling Smart Grid using Generalized Stochastic Petri Net (with Amrita Dey, Nabendu Chaki)
    Building a smart grid for the power system is a major challenge for safe, automated and efficient usage of electricity. The full implementation of the smart grid will evolve over time. However, before new infrastructure is invested in to build the smart grid, proper modeling and analysis are needed to avoid wastage of resources. Modeling also helps to identify and prioritize appropriate system parameters. In this paper, a comprehensive model of the smart grid is proposed using Generalized Stochastic Petri Nets (GSPN). The model is used to analyze the constraints and deliverables of the smart power grid of the future.
  5. Design and development of efficient encryption scheme for Wireless Sensor Networks (with Dipankar Dasgupta, Department of Computer Science, University of Memphis, USA)
    In prior work on a Jigsaw-based encryption algorithm, it was shown that the algorithm was less complex (in terms of number of operations) than the standard AES algorithm; however, we did not demonstrate by any quantitative measure that it was as secure as AES. Further work will investigate (a) the security aspects of the Jigsaw-based encryption algorithm and demonstrate its use in Wireless Sensor Networks (WSN), (b) its performance in wireless channels, and (c) the strength of its security against differential cryptanalysis and brute-force attacks. We will also conduct research on immunology-derived decision support systems for WSNs: monitoring the security situation using multi-level sensory information and applying the acquired knowledge in ad hoc and/or sensor networks. Accordingly, multiple sensory data will be analyzed to generate actions that can be used either to redefine the route (to avoid a sinkhole) or even to specifically recruit "special" responses.
  6. Mobile Agent Security (with Marcin Paprzycki, Systems Research Institute of the Polish Academy of Sciences, Poland)
    There exist a number of scenarios in which software agents are considered to hold promise for the future of computing. One of them is the vision of software agents utilized in the development and implementation of large complex systems. Here, the benefits are grounded in basic principles of software engineering (i.e., decomposition, abstraction and organization) and include faster creation, easier maintenance, scalability, and an overall ease of deployment of complex distributed systems. Separately, the agent approach is expected to play a crucial role in dealing with information overload. Here, intelligent software agents are to learn user preferences and act upon them, finding all and only such information as a given user is going to be interested in. Finally, software agents are also very often mentioned in the context of e-commerce, where they are to play a very important role in support of automatic price negotiations and purchase automation. However, one of the key problems that slows down widespread use of software agents is their security. Two aspects of software agents make them vulnerable: mobility (agents go to other computers, where they can be "captured") and replay-ability (captured agents can be executed over and over again in a black-box environment, leading naturally to the possibility of their re-engineering).
  7. Study of Complexity and Intelligence of Biological and Artificial Complex Systems (with Barna Iantovics, Petru Maior University of Tg. Mures, Romania)
    The purpose of the research is to obtain more complete theories of complexity, intelligence, hybrid decision making, and scaling mechanisms for complex systems specialized in solving difficult problems in general, and for hybrid medical systems in particular. These theories will form the basis of future hybrid medical systems that could cooperatively solve difficult tasks in the medical domain with minimal complexity. Hybrid medical systems include humans (physicians, nurses, patients, etc.), computational agents, and different medical devices having sensors and effectors; as a whole they have high complexity and can fulfill tasks such as medical decision support, medical diagnosis, computational epidemiology, and tasks in healthcare (for example, patient monitoring).
  8. Shortened Hamming Codes Maximizing Double Error Detection (with Mario Blaum, Universidad Complutense de Madrid & IBM, San Jose, California)
    The derived algorithm generates a single error correcting code with maximum partial double error detection capability, for increased protective redundancy without the need for an extra check bit. Since the columns of the parity-check matrix of the code do not contain all possible non-zero vectors, the code has a residual capacity for double error detection. There are many possible choices for the columns of the parity-check matrix of the shortened Hamming code; of all of them, there is one (not necessarily unique) that maximizes the number of double errors that the code can detect. The purpose of this work is to find a general solution and to prove that such a solution is optimal, that is, that for the particular choice of parameters our shortened Hamming code maximizes the number of double errors that can be detected. We are further studying the applicability of this conjecture to other codes, to see if they can provide a higher return at no or very low extra cost.
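The detection criterion above can be checked mechanically: in a shortened code, a double error in positions i and j produces the syndrome c_i XOR c_j, and it is caught (rather than mis-corrected as a single error) exactly when that syndrome is not itself a column of the parity-check matrix. The following Python sketch counts detectable double errors for a hypothetical column choice; the column set and code size are illustrative assumptions, not the parameters of the paper.

```python
from itertools import combinations

def double_errors_detected(columns):
    """Count double errors whose syndrome is not a column of H.

    `columns` are the distinct non-zero parity-check columns of a
    shortened Hamming code, given as integers.  A double error in
    positions i, j yields syndrome columns[i] ^ columns[j]; it is
    detected (rather than mis-corrected as a single error) exactly
    when that syndrome matches no column of H.
    """
    col_set = set(columns)
    return sum(1 for a, b in combinations(columns, 2)
               if (a ^ b) not in col_set)

# Hypothetical example: a (10, 6) shortened code with r = 4 check bits.
# Keep the 4 weight-1 columns and pick 6 of the remaining 11 non-zero
# vectors; different choices detect different numbers of the 45
# possible double errors.  The unshortened (15, 11) Hamming code
# detects none of them.
cols = [0b0001, 0b0010, 0b0100, 0b1000, 0b0111, 0b1011,
        0b1101, 0b1110, 0b1111, 0b0011]
print(double_errors_detected(cols))
```

Searching over all column choices for the one maximizing this count is exactly the optimization problem the work addresses.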
  9. New Frontiers of Network Security: The Threat Within (with Amit Gupta, Nevis Networks, India) Nearly 70% of information security threats originate from inside the organization. Instances of insider threats have been increasing at an alarming rate with the latest trends of mobility (portable devices such as laptops, smartphones and iPads) and ubiquitous wireless connectivity, and this trend grows as more and more web-based applications are made available over the Internet. Insider threats are generally caused by current or ex-employees, contractors or partners who have authorized access to the organization's network and servers. Theft of confidential information is often for either material gain or willful damage. The net result is losses worth millions of dollars in terms of IP theft, leakage of customer/individual information, etc. This work presents an understanding of insider threats, attackers and their motives, and suggests mitigation techniques at the organization level. This methodology is being studied for application in different DAE organizations to provide better security.
  10. Fundamentals of Digital Logic (Book) (with Bijoy Bandyopadhyay, Institute of Radio Physics & Electronics, Calcutta University, Kolkata) This book covers various number systems, Boolean algebra and logic gates, semiconductor devices, combinatorial circuits (multiplexers, etc.), flip-flops, analysis and synthesis of sequential circuits, timing circuits, semiconductor memory, and A-to-D and D-to-A converters. All applicable chapters will have examples in VHDL code.
  11. A Note on the Bounds for the Generalized Fibonacci-p-Sequence and its Application in Data-Hiding (with Sandipan Dey, Microsoft, India; Hameed Al-Qaheri, Kuwait University; Suneeta Sane, VJTI, India) We first suggest lower and upper bounds for the Generalized Fibonacci-p-Sequence, a generalization of the classical Fibonacci sequence, for different values of p. We show that the ratio of two consecutive terms in the generalized Fibonacci sequence converges to the root of a polynomial equation of degree p+1, and then prove the bounds for the generalized Fibonacci-p-sequence, thereby generalizing the exponential bounds for the classical Fibonacci sequence. We show how these results can be used to prove the efficiency of data-hiding techniques based on the generalized Fibonacci sequence. These steganographic techniques use the generalized Fibonacci-p-Sequence to increase the number of available bit-planes for hiding data, so that more data can be hidden in the higher bit-planes of a pixel without causing much image distortion. The bound serves as a theoretical proof of the efficiency of those techniques; it explains why more data can be hidden in the higher bit-planes of a pixel without a considerable decrease in PSNR.
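The sequence itself is easy to experiment with. The sketch below assumes one common definition of the Fibonacci-p-sequence (the first p+1 terms equal to 1, then F(k) = F(k-1) + F(k-p-1)), which may differ in seeding from the paper's, and illustrates the convergence of the ratio of consecutive terms.

```python
def fib_p(p, n):
    """First n terms of the Fibonacci-p-sequence.

    One common definition (assumed here): the first p+1 terms are 1,
    and F(k) = F(k-1) + F(k-p-1) thereafter.  p = 1 gives the
    classical Fibonacci sequence.
    """
    seq = [1] * (p + 1)
    while len(seq) < n:
        seq.append(seq[-1] + seq[-p - 1])
    return seq[:n]

# The ratio of consecutive terms approaches the real root of
# x**(p+1) = x**p + 1 (the golden ratio when p = 1).
s = fib_p(1, 30)
print(s[:8])          # [1, 1, 2, 3, 5, 8, 13, 21]
print(s[-1] / s[-2])  # close to 1.618...
```

In the data-hiding application, these terms play the role of bit-plane weights in place of the usual powers of two, which is why their growth rate (and hence the bound on the ratio) governs the distortion introduced at each virtual bit-plane.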
  12. Embedding Secret Data in HTML Web Page (with Sandipan Dey, Microsoft, India; Hameed Al-Qaheri, Kuwait University) We suggest a novel data-hiding technique for an HTML web page. HTML tags are case-insensitive, and hence a letter in lowercase and in uppercase within an HTML tag is interpreted in the same manner by the browser. We exploit this case-redundancy and the imperceptibility, from the browser's point of view, of case conversions in HTML tags. The embedded data can easily be recovered by viewing the source of the HTML page. This technique can easily be extended to embed secret data in any piece of information whose standard interpreter is case-insensitive.
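The case-redundancy idea can be illustrated in a few lines of Python. This is a minimal sketch, not the paper's implementation: it encodes '1' as an uppercase tag letter and '0' as lowercase, using a deliberately simplistic regular expression for tags.

```python
import re

def embed(html, bits):
    """Hide `bits` (a string of '0'/'1') in the case of HTML tag letters.

    Browsers treat tag names case-insensitively, so <B> and <b> render
    identically; '1' is encoded as uppercase, '0' as lowercase.
    Illustrative sketch only -- real pages need more careful parsing.
    """
    bit_iter = iter(bits)
    def recase(m):
        out = []
        for ch in m.group(0):
            if ch.isalpha():
                b = next(bit_iter, None)
                out.append(ch.upper() if b == '1' else ch.lower())
            else:
                out.append(ch)
        return ''.join(out)
    return re.sub(r'</?[a-zA-Z]+', recase, html)

def extract(html, nbits):
    """Recover the first nbits hidden bits from the tag-letter cases."""
    bits = []
    for m in re.finditer(r'</?[a-zA-Z]+', html):
        for ch in m.group(0):
            if ch.isalpha():
                bits.append('1' if ch.isupper() else '0')
    return ''.join(bits[:nbits])

page = '<html><body><b>hello</b></body></html>'
stego = embed(page, '1011')
print(stego)              # tag-letter cases now carry the bits
print(extract(stego, 4))  # '1011'
```

The stego page renders exactly like the original, since only tag-name case changed; the payload capacity is simply the number of tag letters in the page.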
  13. A Multifactor Secure Authentication System for Wireless Payment (with Ayu Tiwari, IIIT, Allahabad, India and Sudip Sanyal, IIIT, Allahabad, India) Organizations are increasingly deploying wireless online payment applications. Existing Internet-based authentication systems use either the web or the mobile channel individually to confirm the claimed identity of a remote user. The vulnerability is that access based on single-factor authentication alone is not secure enough to protect user data. We propose a new protocol based on a multifactor authentication system. It uses a novel approach based on a Transaction Identification Code and SMS to add another security level to the traditional login/password system. The system is simple to use, deploys with limited resources, and does not require any change in the infrastructure or the underlying protocol of the wireless network. The protocol is extended to a two-way authentication system to satisfy the emerging market need for mutual authentication, and also supports secure B2B communication. This increases the confidence of users and business organizations in wireless financial transactions using mobile devices.
  14. Data Hiding Techniques Using Prime and Natural Numbers (with Sandipan Dey, Cognizant Technology Solutions, India; Ajith Abraham, Norwegian University of Science and Technology, Norway; Bijoy Bandyopadhyay, University of Calcutta, India) Our data-hiding techniques are improvements over the classical LSB data-hiding and the Fibonacci LSB data-hiding techniques. We propose two novel embedding techniques, special cases of our generalized model. The first embedding scheme is based on decomposition of a number (pixel value) into a sum of prime numbers, while the second is based on decomposition into a sum of natural numbers. They not only allow one to embed a secret message in higher bit-planes but also do so without much distortion, with a much better stego-image quality, in a reliable and secure manner. Theoretical analysis indicates that the stego-image quality of the technique using Fibonacci decomposition improves over the simple LSB substitution method, that the prime decomposition method improves drastically over Fibonacci decomposition, and that the natural number decomposition method is a further improvement over prime decomposition.
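As a rough illustration of the virtual bit-plane idea, the sketch below greedily decomposes a pixel value into a sum of distinct natural numbers; the greedy strategy and the weight range are assumptions for illustration, not the exact scheme of the paper.

```python
def decompose(v, weights):
    """Greedily write v as a sum of distinct weights (largest first).

    With weights 1, 2, 3, ... every pixel value up to their sum has
    such a representation; each chosen weight behaves like a (virtual)
    bit-plane, so flipping the bit for weight w changes the pixel
    value by only +/- w.  The greedy choice here is an illustrative
    assumption, not necessarily the exact scheme of the paper.
    """
    bits = []
    for w in sorted(weights, reverse=True):
        if w <= v:
            bits.append(w)
            v -= w
    assert v == 0  # holds whenever v <= sum(weights) for weights 1..n
    return bits

# Pixel value 200 over natural-number weights 1..20:
print(decompose(200, range(1, 21)))
```

The point of replacing powers of two (1, 2, 4, 8, ...) with slower-growing weights such as 1, 2, 3, 4, ... is visible here: even a "high" virtual bit-plane carries a small weight, so embedding in it perturbs the pixel value far less than flipping a high classical bit.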
  15. Steganography and Steganalysis: Different Approaches (with Soumyendu Das, Information Security Consultant, India; Subhendu Das, STQC IT Services, India; Bijoy Bandyopadhyay, University of Calcutta, India) Steganography is the technique of hiding confidential information within any medium. Steganography is often confused with cryptography because the two are similar in that both are used to protect confidential information. The difference lies in the appearance of the processed output: the output of a steganographic operation is not apparently visible, whereas in cryptography the output is scrambled and can therefore draw attention. Steganalysis is the process of detecting the presence of steganography. We have elucidated different approaches to the implementation of steganography using multimedia files and network IP datagrams as cover. Some methods of steganalysis are also discussed.
  16. Evolution Induced Secondary Immunity - An Artificial Immune System based Intrusion Detection System (with Divyata Dal, Siby Abraham, Mukund Sanglikar, University of Mumbai; Ajith Abraham, Norwegian University of Science and Technology, Norway) The analogy between immune systems and intrusion detection systems encourages the use of Artificial Immune Systems for anomaly detection in computer networks. We describe a technique of applying an Artificial Immune System along with a genetic algorithm to develop an Intrusion Detection System. The method attempts to evolve the Primary Immune Response into a Secondary Immune Response using the concept of memory cells prevalent in natural immune systems. A genetic algorithm using the genetic operators selection, cloning, crossover and mutation facilitates this. The memory cells formed enable faster detection of already-encountered attacks. These memory cells, being highly random in nature, are dependent on the evolution of the detectors and guarantee greater immunity from anomalies and attacks. The fact that the whole procedure is enveloped in the lightweight concepts of approximate binding and memory cells from natural immune systems makes this system reliable, robust and quick to respond.
  17. An Iterative Algorithm for Microwave Tomography Using Modified Gauss-Newton Method (with A.K. Kundu, B. Bandyopadhyay, University of Calcutta, India) An inverse iterative algorithm for microwave imaging based on a moment-method solution is presented here. A modified Gauss-Newton method is used to address the nonlinear ill-posed problem. The stabilization term consists of a combination of three weighted discrete derivative operators, instead of the identity matrix used in the Levenberg-Marquardt-based algorithm previously developed by us. The present algorithm shows a marked improvement over the previous one in the quality of images reconstructed from synthetic data under noisy conditions.
  18. Development of Tools for Accessibility and Mobility for the Visually Impaired Persons (with Variable Energy Cyclotron Centre, Department of Atomic Energy, Kolkata; Webel Mediatronics Limited, Government of West Bengal): This work allows a visually impaired person to access websites with the help of text-to-speech. Links will be read out with an indication that they are hyperlinks; static text will also be read out. A screen reader with a voice-command interface will allow visually impaired users to control the application through voice commands. The project also envisages development of a mobility-aid device for visually impaired persons. The device will be developed using ultrasound sensors and will be body-worn or hand-held. Audio beeps of different frequencies will be generated for different obstacles: one sensor shall be focused on the floor/path, while the other shall recognize environmental obstacles in front.
  19. Development of Audio, Text and Braille Material Delivery System for Visually Impaired Persons (with Media Lab Asia, Ministry of Communication and Information Technology, Government of India; Webel Mediatronics Limited, Government of West Bengal) A delivery system for audio 'reading' material and e-text reading material through screen-reading software for visually impaired persons is being developed. It is very difficult to 'read' a large book in recorded audio or e-text format without the facility of 'browsing'; i.e., presentation of a document in a hierarchical, web-style, non-linear format is very important to make it accessible to visually impaired persons. We are developing a hand-held device that will play audio and e-text files kept in such a format. A website will host a large number of audio and e-text books; content includes textbooks, novels, newspaper summaries, journals, articles, etc. The hand-held device shall be used for offline listening/reading. Internet radio will be used for transmitting information of interest to the blind community, involving a streaming medium that presents listeners with a continuous "stream" of audio.
  20. A Very Simple Approach for 3-D to 2-D Mapping (with Sandipan Dey, Anshin Software, Kolkata; Ajith Abraham, Norwegian University of Science and Technology, Norway) We often need to plot 3-D functions, e.g., in many scientific experiments. Plotting a 3-D function on a 2-D screen requires some kind of mapping. Though 3-D rendering libraries such as OpenGL and DirectX have made this job very simple, these libraries come with many complex pre-operations that are simply not needed, and integrating them with an arbitrary system is often difficult. This paper presents a very simple method of mapping from 3-D to 2-D that is free from any complex pre-operation and will work with any graphics system that provides primitive 2-D graphics functions. We also discuss the inverse transform and how to do basic computer graphics transformations using our coordinate-mapping system.
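As an illustration of the kind of mapping involved, the following sketch performs a generic perspective projection onto the screen plane; it is an assumed textbook transform, not necessarily the exact mapping proposed in the paper, and the viewing distance d is an illustrative parameter.

```python
def project(x, y, z, d=2.0):
    """Map a 3-D point onto the 2-D screen plane z = 0.

    Simple perspective projection with the eye on the z-axis at
    distance d behind the screen; d is an assumed viewing parameter,
    not a constant from the paper.  Only a primitive 2-D plotting
    call is then needed to draw the resulting point.
    """
    s = d / (d + z)          # shrink factor grows with depth
    return x * s, y * s

# A point farther away is drawn proportionally closer to the centre:
print(project(1.0, 1.0, 2.0))   # (0.5, 0.5)
print(project(1.0, 1.0, 6.0))   # (0.25, 0.25)
```

A mapping of this form needs no rendering library at all: a plotting loop evaluates the 3-D function on a grid, projects each point, and hands the 2-D coordinates to whatever primitive line or pixel routine the host system offers.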
  21. Impact of Node Mobility on MANET Routing Protocols Models (with Bhavyesh Divecha, IIM, Kolkata; Ajith Abraham, Crina Grosan, Norwegian University of Science and Technology, Norway) A Mobile Ad-Hoc Network (MANET) is a self-configuring network of mobile nodes connected by wireless links, forming an arbitrary topology without the use of existing infrastructure. In this paper, we study the effects of various mobility models on the performance of two routing protocols: Dynamic Source Routing (DSR, a reactive protocol) and Destination-Sequenced Distance-Vector (DSDV, a proactive protocol). For the experiments, we considered four mobility scenarios: the Random Waypoint, Group Mobility, Freeway and Manhattan models, selected to represent possible practical applications. Performance comparison was also conducted across varying node densities and numbers of hops. The experimental results illustrate that the performance of a routing protocol varies across different mobility models, node densities and data-path lengths.
  22. An Overview of the Evolutionary Trends in Molecular Computing using DNA (with Abhinav Maurya, Anu Nair, University of Mumbai) This work addresses the use of the biochemical molecule DNA to solve computational problems. DNA computers use strands of DNA to perform computing operations. The computer consists of two types of strands: instruction strands and input-data strands. The instruction strands splice together the input-data strands to generate the desired output-data strand. Since the instruction strands are generic nucleotide codes for the various operations, DNA computers meet the accepted standard for being classed as true computers according to the Turing machine concept. A strong point in favor of DNA computing is the potential storage capacity of DNA, which far exceeds that of the most advanced silicon storage devices. A major challenge DNA circuit engineers face is the difficulty of predicting circuit performance at the design stage, with the consequence that actual construction requires significant experimental effort, even for very simple circuits.
  23. ACRR: Ad-hoc On-Demand Distance Vector Routing with Controlled Route Requests (with Jayesh Kataria, University of Mumbai; P.S. Dhekne, BARC) Reactive routing protocols such as Ad-hoc On-Demand Distance Vector (AODV) routing and Dynamic Source Routing (DSR), used in Mobile Ad-hoc Networks (MANETs), work by flooding the network with control packets. There is generally a limit on the number of such packets that can be generated or forwarded, but a malicious node can disregard this limit and flood the network with fake control packets. These packets hog the limited bandwidth and processing power of genuine nodes while being forwarded; genuine route requests suffer as a result, and many routes either do not get a chance to materialize or end up longer than they would otherwise be. In this paper we propose a non-cryptographic solution to this problem and demonstrate its efficiency by means of simulation.
  24. Mobile Ad Hoc Network Security Vulnerabilities (with Animesh K. Trivedi, Rajan Arora, Rishi Kapoor, Sudip Sanyal, IIIT, Allahabad; Ajith Abraham, Norwegian University of Science and Technology, Norway) The routing protocols used in the current generation of mobile ad hoc networks are based on the principle that all nodes will cooperate. However, any node can misbehave, where misbehavior means deviation from the assumptions of the regular routing and forwarding protocol. Misbehavior may arise unintentionally, when a node is faulty, or intentionally, when a node wants to save its resources: to conserve battery, bandwidth and processing power, a node may refuse to forward packets for others. Without any counter-policy, the effects of misbehavior have been shown to dramatically decrease network performance. Depending on the proportion of misbehaving nodes and their strategies, network throughput can decrease, and there can be packet losses, denial of service or network partitioning. These detrimental effects of misbehavior can endanger the entire network. We deliberate on all these issues in our work.
  25. An LSB Data Hiding Technique Using Natural Numbers (with Sandipan Dey, Anshin Software, Kolkata; Ajith Abraham, Norwegian University of Science and Technology, Norway) In this work, a novel data-hiding technique is proposed as an improvement over earlier work in this field. First we mathematically model and generalize our approach; then we propose our novel technique, based on the decomposition of a number (pixel value) as a sum of natural numbers. This particular representation generates an altogether different set of (virtual) bit-planes, suitable for embedding purposes. They not only allow one to embed a secret message in higher bit-planes but also do so without much distortion, in a reliable and secure manner that guarantees efficient retrieval of the secret message. Analysis indicates that the stego-image quality of the natural-number decomposition technique improves drastically over the prime and Fibonacci decomposition techniques. Experimental results show that the stego-image is visually indistinguishable from the cover image.
  26. Whole Genome Comparison on a Network of Workstations (with Marcin Paprzycki, SWPS and IBS PAN, Warsaw, Poland; Rajan Arora, IIIT, Allahabad; Maria Ganzha, EUH-E, Elblag and IBS PAN, Warsaw, Poland) Whole-genome comparison consists of comparing or aligning genome sequences with the goal of finding similarities between them. Previously we showed how the SIMD extensions of Intel processors can be used to efficiently implement the genome-comparing Smith-Waterman algorithm. Here we present a distributed version of that algorithm and show that on somewhat outdated hardware we can achieve speeds upwards of 8000 MCUPS, one of the fastest implementations of the Smith-Waterman algorithm.
  27. Applying SIMD Approach to Whole Genome Comparison on Commodity Hardware (with Marcin Paprzycki, SWPS and IBS PAN, Warsaw, Poland; Rajan Arora, IIIT, Allahabad; Maria Ganzha, EUH-E, Elblag and IBS PAN, Warsaw, Poland) Whole-genome comparison consists of comparing or aligning two genome sequences in the hope that analogous functional or physical characteristics may be observed. In this work, we present an efficient version of the Smith-Waterman algorithm in which we utilize sub-word parallelism to speed up sequence-to-sequence comparison, using the Streaming SIMD Extensions (SSE) of Intel Pentium processors. We compare two approaches, one requiring explicit data-dependency handling and the other built to handle dependencies automatically, achieve a speedup of 10-30, and establish the optimum conditions for each approach.
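For reference, the recurrence that both SIMD approaches accelerate can be written as a plain scalar sketch; the scoring parameters below are illustrative defaults, not the values used in the paper.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Scalar reference Smith-Waterman local alignment score.

    H[i][j] = max(0, diagonal + substitution score, gap from above,
    gap from the left); the best local alignment score is the maximum
    cell.  The SIMD versions compute this same recurrence, vectorized
    along anti-diagonals or columns; the scoring parameters here are
    illustrative, not the paper's.
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACACACTA", "AGCACACA"))  # textbook pair, score 12
```

The data dependency visible here (each cell needs its left, upper and diagonal neighbours) is exactly what makes the two SIMD strategies differ: one handles the dependency explicitly, the other restructures the computation so the dependency is absorbed automatically.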

Prior Work

  1. Summary of Work done in the areas of Networks, Security, Parallel Processing and Computer Algorithms

    I have worked in, and guided work in, the areas of parallel processing, computer algorithms, ad hoc network protocol improvement under malicious-node attack, intrusion detection systems, spam filters, protocols to counter online dictionary attacks, one-time password systems, redundant cluster-based systems for ad hoc networks, grid security, multi-path highly reliable data transmission, multilevel adaptive intrusion detection systems, and distributed highly reliable and available web-server clusters. Research work has also been done on the theory and design of t-unidirectional error correcting and d-unidirectional error detecting codes, and on fault tolerance in hypercube-based parallel architectures. [1994-present]

  2. Speech Research and System Development

    I was the Co-Principal Investigator (Co-PI) of the Voice Oriented Interactive Computing Environment (VOICE) project, undertaken as a part of the Knowledge Based Computer System (KBCS) Project, Phase II, under the aegis of the (then) Department of Electronics (DOE), from its initiation in 1995, and was the Principal Investigator (PI) from 1998 till the end of the project in 2000. Speech recognition and synthesis systems for Hindi were developed under the project, with an application prototype for a speech-activated travel guide. A Hindi speech database was also designed and developed for the speech research community. I have research publications in speech recognition and synthesis.

  3. Projects of National Importance:
    1. Integrated Data Handling System for On-line Air-Sector Control

      This project (1970-1984) was successfully completed in 1984 through all its phases. I did a fair stint of design and development work from 1973 to 1976 and then again from 1979 to 1984, and was in complete charge of the hardware design, development, maintenance and debugging of the system throughout the software development stage, from '79 to '84. This was a tightly-coupled multi-computer system, designed for air-sector control and all the associated data handling. M/s ECIL produced these systems, each worth several crores of rupees, and 25+ systems have been installed in the field so far. I was a co-recipient of the VASVIK award for this work. [1973-1984]

    2. Processor Design and Development

      A 16-bit highly reliable computer system was designed, developed and tested completely, with peripheral devices. The processor acted as the central controlling element of a rugged mobile exchange in the high-availability computer system class. The basic development of this microprogrammed computer, built around bit-slice microprocessors, was completed in a relatively short time (September 1, 1977 - December 27, 1977). Rugged production units were successfully field-tested. The know-how was passed on to Indian Telephone Industries (ITI), Tata Electric Company (Research & Development) and Bharat Dynamics Ltd. [1977-1980] Some academic fallout of the processor design:
      1. Single Error Correcting Partial Double Error Detecting Code
        The famous Single Error Correcting (SEC) Hamming code was extended: a Single Error Correcting Partial Double Error Detecting (SEC-PDED) code having the same number of check bits as the Hamming SEC code was developed. It corrects all single-bit errors and in addition detects a high fraction of all n-choose-2 possible double errors, for an (n, k) code where n is the total number of bits (k data bits plus n-k check bits). In other words, the SEC-PDED code mis-corrects fewer double errors as false single errors than the Hamming SEC code does. The extra hardware needed for this partial double error detection is minimal: a 10-input NAND gate for a (21, 16) SEC-PDED code. For a 60-bit machine (CDC Cyber 170), a PDED efficiency of over 90% is achieved with only 7 check bits. This code was implemented by the company Elettronica 'MAEL', Carsoli (Aquila), Italy, for a computer memory system. (1978)
      2. An algorithm for non-restoring division was developed. This published algorithm was used by M/s Motorola for their 6802 microprocessor. The algorithm was also used in TIFR and in IIT, Mumbai for microprogrammed processor design. (1977)
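The hardware algorithm can be mimicked in software as follows; this is an illustrative unsigned n-bit sketch of generic non-restoring division (one add or subtract per quotient bit, with a final correction), not the published version.

```python
def nonrestoring_divide(dividend, divisor, n=8):
    """n-bit unsigned non-restoring division (illustrative sketch).

    Instead of restoring the remainder after a subtraction that goes
    negative, the next step adds the divisor back while shifting --
    exactly one add or subtract per quotient bit, as in the hardware
    algorithm, followed by a single final correction.
    """
    assert 0 <= dividend < (1 << n) and divisor > 0
    r = 0
    q = 0
    for i in range(n - 1, -1, -1):
        r = (r << 1) | ((dividend >> i) & 1)   # shift in next bit
        if r >= 0:
            r -= divisor                       # trial subtract
        else:
            r += divisor                       # add back while shifting
        q = (q << 1) | (1 if r >= 0 else 0)    # quotient bit from sign
    if r < 0:                                  # final correction step
        r += divisor
    return q, r

print(nonrestoring_divide(100, 7))   # (14, 2)
```

The attraction for microprogrammed processor design is that the restore step of classical restoring division disappears, so every iteration has the same fixed add/subtract-and-shift shape.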

    3. 3-D Scanning Radar Attachment Project

      A computerized system was developed and connected to the 3-D scanning radars of the Cyclone Warning Radar Station of the Indian Meteorological Department at Madras (Chennai). The radar signal was pre-processed by the Digital Video Integrating Processor (DVIP), designed and developed at TIFR. The output of the DVIP was fed to the processor developed by me for further processing and generation of a "rainfall map" to be displayed on a video monitor. [1978-1979]