NASSCOM Product Conclave

Over the past few years, startup culture has finally started taking shape in Pune. Many startup-related events keep happening. According to one estimate, there are about 400 startups in Pune, and a new debate has started over whether Pune is the new Bengaluru. Along with the startup culture, product culture is also on the rise. We have been having Pune Connect events particularly promoting this product culture; I had attended its 2012 edition. NASSCOM has also been holding Product Conclave sessions in Pune for the last four years. I finally managed to attend this time. Here is a blog on what I experienced.

The venue was The Westin hotel on the east side of Pune. Conferences like these typically have many sessions, and many of them run in parallel. So one needs to really decide on what one wants to attend to get the most out of it. These days apps and websites such as sched.com come in handy. I too had my schedule chalked out beforehand using this app. I reached the venue just before 9 am, which is when the first keynote was to begin. We had Dr Ganesh Natrajan speak to begin with. He is always a pleasure to listen to. He walked us through NASSCOM's journey into startups and products and how it has recently launched the Startup Warehouse initiative. He also, obviously, touched upon his favorite topic: smart city initiatives for Pune. After him, Ashok Soota spoke on business strategies. This was the first time I heard him. He is a founder of the famous IT services company MindTree. And now, at the age of 69, he has started a new company called Happiest Minds. He also spoke about his book Entrepreneurship Simplified. Then followed a session on success stories of entrepreneurs from non-metro cities of Maharashtra such as Jalgaon, Satara and Kolhapur. I was particularly impressed by Aurangabad-based Prashant Deshpande's company Expert Global Solutions, with its growth despite operating from a tier-2 city.

After this, many sessions happened in parallel. You can find details on those here. I will describe what I attended.

I skipped many of the sessions related to the usual buzzwords such as IoT, analytics and blockchain. First I walked into a session titled “User Research: Transforming your startup idea into viable product”. This was, needless to say, about user research aspects while designing a product, which pertains to UX, or user experience. It is typically confused with market research, which is a different thing. User research focuses on how users will use the product, their needs, and their behavioral patterns, directly supplying inputs to UX design. I also partially attended a session titled “Cyber Security: Mitigating DDoS”, which was about getting under the hood of one of the most significant threats, the Distributed Denial of Service (DDoS) attack. It provided interesting insights into the grey world of the hackers causing these attacks, and also some tools and techniques for mitigating them.

At lunch, I took a walk around the product showcase aisle. There I met my old classmate Sandeep Tidke, who is running a company called LabJump. It is an interesting business idea, which is working due to the proliferation of virtual training lab requirements. That is pretty evident, as Google recently acquired Qwiklabs, which was doing similar stuff. I also bumped into another acquaintance of mine named Abhijit Joshi, whose company WhiteHedge is engaged in providing Docker as a Service by partnering with Docker in India.

Later in the day, I walked into a session on design thinking. It was titled Future-proofing Product Innovation using Design Thinking and was by Manoj Kothari of Turian Labs. Despite being a post-lunch session, it kept me glued to my seat for two hours. He stressed the importance of showing empathy towards your users to understand more about their pain points. I walked out with the thought of applying some of the design thinking principles in what we do at work. I am a big fan of innovation and ideation, and design thinking helps break conventional thought. Then I went to a session that turned out to be a treat to the ears. This was by Anshoo Gaur (an investor himself, with his own VC firm Pravegaa) and it was titled How to judge the performance / potential of your startup. He began by asking very fundamental questions and drew attention to the fact that there are enough problems in the country to go after, versus what the current startup fraternity is busy with. He also cautioned against the “me-too” mentality of entrepreneurs, and instead asked them to focus on adding value to the ecosystem. He emphasized achieving profitability before scale, rather than chasing scale at the cost of profitability. He touched upon metrics to track within a startup by drawing an analogy between an organization and a machine.

As the last session of the day, I had decided to attend ANTIGyan, which was much hyped in the morning as one session not to be missed. And it turned out to be a real entertainer. This was by Ajeet Khurana, an investor himself, who threw open the issues and gaps in the startup ecosystem in a lighter vein, breaking conventional thoughts and hence causing anti-gyan (similar to an anti-pattern). Citing the buzz around products in the last few years, which is a welcome shift from IT services for India, his wisdom pitch was that, ultimately, businesses are funded, not products.

The event saw more than 600 participants and was full of buzz. Organizations such as NASSCOM and TiE have succeeded in acting as catalysts to this growing melting pot of startups, which is so nice to experience.

Moved to AWS? What next?

This year’s AWS re:Invent, the week-long annual mecca of AWS cloud users and professionals, concluded recently in Las Vegas. Based on the reports, it seems that this event is getting bigger and bigger every year. I heard there were more than 30K people this year! I cannot wait for a similar event to take place locally here in Pune, which typically follows a few months later. I have attended local events in the past and written about them here. The sheer speed at which newer and newer features are rolled out is amazing and mind-boggling. This is true with Google and Microsoft’s Azure as well. AWS even announced a region (Mumbai) in India, which was a long-standing requirement considering the growth of the startup culture in India over the last few years. Yesterday, I noticed Google Maps advertisement (#LookBeforeYouTravel) billboards on the streets of Pune, which were, obviously, encouraging commuters to use Google Maps to combat the ever-growing, ever-chaotic traffic situation in Pune. This is the first time I have seen Google getting on the streets to advertise. Nor have I seen anything similar from AWS as yet. Of course, we see advertisements for Amazon’s online store all over, all the time.

At AWS re:Invent, some of the announcements were understandably around the ever-growing IoT platform and data analytics/artificial intelligence. I was particularly interested in hearing James Hamilton, AWS’s distinguished infrastructure engineer. I have been following his blog for over a year now, since I came to know about it. I have always wondered what is behind AWS’s infrastructure. You may find his blog here and also the points he made during his talk here. The proliferation of public cloud platforms such as AWS has made enterprises rethink their IT strategy. And why not? Cost saving, one of the biggest drivers, at different levels, is always a welcome thing. Many of them decide and eventually move to AWS, and that is where new challenges emerge. Most enterprises adopt a Hybrid IT/Hybrid Cloud strategy, where they choose the public cloud for workloads that would result in a lower TCO. But ultimately these are enterprise workloads, many a time connected back to on-prem IT systems or even some other private cloud or hosted private cloud ecosystems. This results in certain challenges which enterprise customers do need help with. Recently, I wrote a blog on Sungard Availability Services' (Sungard AS) corporate blog website about these challenges and how we at Sungard AS are trying to address them. Hope you will find it useful.

Anyway, let me close this blog on a wacky (but futuristic) note. Considering the sheer growth of AWS, Google, and Azure (AWS is doing more than USD 2 billion per quarter in revenue; the others are probably similar, but have not disclosed), I wonder if they will have a presence in outer space too. How about an AWS data center on the Moon or in outer space?

Update on Dec 31, 2016: I did not imagine that my wacky thought of Amazon having a data center in space would be realized so soon. Today, I read that Amazon is planning to set up a fulfillment center in space, with a spacecraft flying at a height of 45,000 feet. This is not a data center, I understand. But I think it is certainly very close, now that they would have a fulfillment center which not only stores products but also has drones and Unmanned Aerial Vehicles (UAVs) delivering ordered goods to customers. See more here.

S R Ranganathan, Colon Classification

I am an avid radio listener. I am generally hooked on the state-run (Prasar Bharati) radio channels under All India Radio, in the city of Pune. They provide, besides entertainment, many informative programs, which surprise me again and again. This week, one of them ran a short program on the life of S R Ranganathan, who was the pioneer of library and information science in India. The occasion was that his 125th birth anniversary year was to start on Aug 12. After listening to this program, I went down memory lane, to my encounter with his book on colon classification.

This was in the year 2007. I was working at Saba Software, a Human Capital Development and Management (HCDM) software company. I was handling a project related to localization (L10N) and internationalization (I18N) of their software. This project gave me further exposure, from a technical perspective, to the world of languages, one of my favorite subjects. We happened to have a consultant from Canada, Steven Forth, travel to Pune to work with me to define a strategy on L10N and I18N. During discussions with him, I came across words such as taxonomy and ontology, which are related to categorization and classification. Incidentally, around the same time, due to my association with an Indology course, and subsequent exploration of Indian philosophical systems, particularly Nyaya and Vaisesika Darshan (also called Indian Logic), I had encountered similar terminologies.

Steven Forth, being an avid reader himself, ventured into Pune city to shop for books. I accompanied him, and we went to the then-landmark Maney’s book stall (which is now closed). Among other books, he bought S R Ranganathan’s book titled Colon Classification. During my discussions with him about that book, I was surprised to learn about this Indian mathematician and information scientist, who had done such pioneering work of which I was completely unaware. All of his work was achieved way back in the 1950s and even earlier. I am sure many of us are not aware of it. Subsequently, I also bought that book. He is considered the father of library science in India, and indeed in the rest of the world. His birthday (Aug 12) is observed as National Librarian’s Day in India. Subsequently, I learnt that his thoughts on classification came from the concepts of classification and the world view of Nyaya and Vaisesika Darshan. He was also instrumental in formulating the five laws of library science, which are:

Law#1: Books are for use.
Law#2: Every reader his / her book.
Law#3: Every book its reader.
Law#4: Save the time of the reader.
Law#5: The library is a growing organism.

I am particularly fascinated by his thinking on information classification, and I am still exploring this field. I am sure many of these concepts are useful in this age of big data analytics. His information retrieval concepts might be relevant for today's digital age, where there is an explosion of data and the principles of information science apply all the more. I intend to write a series of blogs here on his work subsequently. Today I wanted to introduce this lesser-known personality (outside the small circle of the library science fraternity) from India and his pioneering work, on the occasion of his 125th birth anniversary.
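To give a small flavor of his faceted approach: Colon Classification composes a class number by joining the facets of a subject — Personality, Matter, Energy, Space, Time (the PMEST formula) — each introduced by its own connector symbol. The little Python sketch below illustrates only the composition idea; the facet codes used are illustrative stand-ins, not taken from the actual CC schedules.

```python
# Toy sketch of faceted classification in the spirit of Ranganathan's
# Colon Classification. Each PMEST facet has its own connector symbol:
#   , Personality   ; Matter   : Energy   . Space   ' Time
CONNECTORS = {
    "personality": ",",
    "matter": ";",
    "energy": ":",
    "space": ".",
    "time": "'",
}
FACET_ORDER = ["personality", "matter", "energy", "space", "time"]

def class_number(basic_class, **facets):
    """Compose a class number from a basic class and optional PMEST facets."""
    parts = [basic_class]
    for facet in FACET_ORDER:
        if facet in facets:
            parts.append(CONNECTORS[facet] + str(facets[facet]))
    return "".join(parts)

# Hypothetical facet codes, for illustration only:
print(class_number("L", personality="45", matter="421", energy="6"))
# → L,45;421:6
```

The point of the notation is that each facet of a subject stays individually addressable, which is exactly what makes faceted schemes interesting for modern information retrieval.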

Computers-then and now: Part#1

The other day, I happened to go for an evening walk after a long time. Usually, I walk in the morning. During that evening walk, I met a gentleman who is now living his retirement life. He used to work for the weather office. After the formal exchange of pleasantries, he asked me what I was up to, to which I replied that I am into cloud technology these days. His curiosity was aroused, and he asked me a seemingly naive question: what is this cloud technology? I always find it fascinating to talk about computers and related technology to lay people in a language they can understand and relate to. But here the case was different. I knew that he had programmed in FORTRAN as part of his responsibilities at the weather office, and that he had seen the earliest of computers through the evolution of personal computers and, to some extent, the Internet. So with all that in mind, I explained cloud to him in a way he could relate to and understand, and he did. We went on talking about the evolution of computers in general, and he fondly went down memory lane to his early days with computers, which I encouraged him to write about.

This incident prompted me to think of putting down some of that evolution the way I have seen it. Anyone who has spent a while in the technology space is sure to be nostalgic about how technology has changed and metamorphosed over time. I know I am using the medium of a blog to talk about it. I don’t intend to capture all my nostalgia in one blog; this might need a series of blogs. At the same time, I don’t want it to sound like a history of computing in India, for that is already captured in detail at a number of other places.

My association with computing and technology in general started in 1986. That’s when I enrolled in an undergraduate program in computer science. Ours was the first batch. The lab was bootstrapping; it was temporarily hosted in the labs of the Physics department! Our lab initially consisted of only one IBM-compatible personal computer, with a horizontal CPU box and a CRT monitor sitting on top of it. The CPU box had a 5 1/4 inch floppy drive. The whole machine ran MS-DOS. I believe it had 640 KB of RAM, though I don’t remember the exact configuration. And I believe it ran an Intel 8086 processor. No Internet, no printer, no web cam, no mouse! We learnt COBOL (Micro Focus COBOL), Turbo Pascal and Turbo C programming on it. I still remember using Amkette and Verbatim floppy disks for storing data and our programming assignments. Yes, I agree, my association did not begin with punch cards, or those large mainframe computers, or even with operating systems such as CP/M. On the digital electronics side, I remember learning to program the 8085 microprocessor and peripheral devices on an educational kit, and the 8051 micro-controller, at that time.

We got dBase III in 1989 and used it for our project. I remember our computers being hit by a virus called C-Brain that year. We also connected two computers using a serial connector and carried out our experiments on data communications over RS-232 by programming the serial port. No LAN as yet. I also saw a dot matrix printer for the first time that year; this one was from Epson. We also got a proprietary Unix system from Zenith, if I remember correctly, that year. It might have been Xenix, a variant of Unix. There were about 6-8 terminals connected to one Unix CPU box, which was kept in a separate area called the server room. BTW, the lab also got an air-conditioner, mainly for that server room. That is where we did our assignments on Lex and YACC. That Unix box had a DBMS called Unify.

Later that year, I took a job as an instructor at a computer training institute, where I saw 20-25 computers in one place for the first time in my life. All still running MS-DOS, no LAN, with floppy drives. I also saw word processing software called WordStar, and spreadsheet software called Lotus 1-2-3. I started developing menu-driven applications using Turbo C, dBase III, FoxBase and Clipper on console-based monitors. No MS-Windows yet for me, though the first few versions were out. I do remember having a glance at an Apple Mac machine at a local Mac shop in Pune in early 1990, about which I have written here.

In the next blog in this series, I will write about my introduction to Apple Macs and also MS-Windows, and yes, those 3 1/2 inch floppy drives. Some evolution there! This was around 1991. Stay tuned till then!

AWS Enterprise Summit in Pune

AWS as a cloud provider has grown enormously in the last few years. My interest in it has grown of late, since at Sungard AS, where I work, we build various solutions and services on AWS catering to enterprise customers. This was the first enterprise summit by AWS in Pune. I was not quite excited when I looked at their agenda for the summit, except for one or two sessions which looked closer to my interests. But nonetheless, I registered for it, and this blog is a quick report of what I saw there.

The conference was at the Le Meridien hotel in Pune today (June 25, 2015). I decided to take public transport (our beloved PMPML bus service in Pune) instead of driving down. I have been sick and tired of driving to work in messy traffic every day. Traveling via public transport was not all that great an experience either. I don’t want to go on rambling about it here (but interested and curious ones can look at it here). I reached the venue on time and got myself through the formalities. The setting was nice inside. I soon found myself at the demo booths set up by the conference sponsors (which, among a few vendors, included a Try and Ask booth and an AWS Startup Ecosystem promotion booth).

Amidst a full house in the main hall (close to 350 people rubbing shoulders :-)), the keynote began with AWS India head Bikram Singh, with participation from BMC Software, Tata Motors and Sokrati. Bikram went over the history, threw out numbers depicting AWS’s size, and stressed how cloud is the new normal and how cloud has crossed the chasm by going beyond Gartner’s technology hype cycle. BMC’s Suhas Kelkar talked about BMC’s AWS offerings for cloud life-cycle management (CLM), while Tata Motors’ CIO talked about their foray into the cloud using AWS and how they are reaping benefits in innovation without carrying dead weight (in terms of on-premise infra), and Sokrati (which represented startups on AWS) spoke about how they are exploiting various analytics services from AWS.

After that, there was a talk by Intel, which happened to be the platinum partner of the event. But the talk did not bring any relevant information and was not crisp. The morning session ended with a panel discussion, which wasn’t impressive either.

Post lunch, break-out sessions were planned in two parallel tracks. The original agenda included: Hybrid Infrastructure Integration, Architecting for Greater Security, Building Mobile Apps on AWS, HPC/Big Data on AWS, DR of on-premises IT infrastructure with AWS, DevOps – Transforming Software Development, Digital Media & Entertainment Workloads on AWS and Microsoft/SAP Workloads on AWS. The break-out sessions we actually found looked very different.

Anyway, I managed to attend three of those break-out sessions. The first one was on Big Data on AWS. This session went over the rising need for big data applications, how big data is all around us, and how AWS is ready with battle-hardened technologies for it in the form of Redshift and EMR. The second one was on Microsoft/SAP workloads on AWS, which went over the support in AWS and key architectural considerations. The third one was on hybrid environments, where on-premises infra coexists with cloud infra. It went over the support and considerations for setting up such an environment with the help of AWS. These sessions were not deep enough, did not go beyond the standard information available on AWS websites, and left me wanting deeper insights.

All in all, the summit was not quite a great experience, for me at least. I kept wondering what my takeaway from the summit was while riding the bus back home.

Object Databases to NoSQL and to Cloud Databases

The software industry is characterized by rapid change. New technologies get introduced on a regular basis. It is also characterized by technologies getting morphed, presenting themselves in new avatars, or getting re-purposed.

Take the example of object databases. Back in the 1980s, object database management systems (ODBMS) were considered serious contenders to their relational alternatives. They out-performed RDBMSes in many situations. They allowed application developers to think in terms of objects end-to-end, without any impedance mismatch. They alleviated costly join processing and provided consistent persistence. Many such companies existed and still exist even today. I used to work for one such company, called Versant. In fact, before that, I worked for another company called MediaDB, whose product was based on ODBMS technology but was built and optimized for storing multimedia objects. Versant was, and is, a general-purpose ODBMS, like Objectivity, O2, Poet, GemStone, etc.
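The "impedance mismatch" mentioned above is easiest to see in code: a program already holds its data as a nested object graph, which an ODBMS can persist as-is, whereas a relational store forces the graph to be flattened into rows and reassembled via joins. A rough Python illustration follows; the `db.store(car)` call in the comment is a hypothetical stand-in, not Versant's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Wheel:
    diameter_cm: float

@dataclass
class Car:
    model: str
    wheels: list = field(default_factory=list)

# The application's natural shape: one object graph.
car = Car("Roadster", wheels=[Wheel(48.0) for _ in range(4)])

# An ODBMS persists the whole graph directly -- conceptually just:
#   db.store(car)            # hypothetical call, for illustration
#
# A relational store instead needs two tables and a join to rebuild it:
cars_table = [("Roadster",)]
wheels_table = [("Roadster", 48.0)] * 4          # foreign key = model
rebuilt = {
    model: [d for (m, d) in wheels_table if m == model]
    for (model,) in cars_table
}
print(rebuilt)  # → {'Roadster': [48.0, 48.0, 48.0, 48.0]}
```

The mapping code in the second half is exactly the boilerplate (and runtime join cost) that ODBMSes set out to eliminate.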

ODBMSes built their businesses by providing data management solutions for complex applications that have complex object graphs and need to treat them end-to-end that way. ODBMSes became prevalent in niche markets but were never able to make it to the big league, for a variety of reasons. They often struggled to justify their existence. Then came two waves of technologies which made them relevant once again. One of them was the NoSQL wave. The other was the cloud wave.

With the proliferation of the Internet and web-based applications, managing unstructured data became the need of the hour. Big data analytics added further spice to how data from a variety of sources is treated and managed. SQL was no longer a perfect fit for such applications, so many open source initiatives brought about the NoSQL movement and came up with data stores based on non-relational concepts. ODBMS vendors quietly watched this revolution: ODBMSes were, after all, the first NoSQL databases. As NoSQL databases became more prevalent, ODBMS vendors metamorphosed and repositioned themselves to ride the NoSQL movement. Look at Versant’s website to see the case in point. They, of course, got acquired by Actian recently. But this repositioning of Versant began much before the acquisition, via technologies such as Versant JPA.

With the proliferation and wider adoption of cloud, along with new-age applications such as big data analytics, ODBMSes are well positioned to apply themselves to these needs and to the scale of the cloud. Many cloud vendors do provide object storage capabilities, such as Amazon’s AWS S3, and there are open source object storage initiatives such as Ceph and Swift, which can be used with OpenStack or CloudStack. But these are not ODBMSes. They are raw object storage mechanisms, and very useful ones. To cater to the needs of the second wave, ODBMSes have started to position themselves for the cloud. Cloud computing means scale, high volumes and distributed databases, and typical applications are more analytical in nature than transactional. For ODBMSes, there are a few facets of a cloud database which still need to be taken care of, such as Database as a Service (DBaaS) and high availability (HA) from the cloud ecosystem perspective. We will see these gaps in cloud support being addressed in the coming years.

CEP/ESP Technology, IoT and Cloud

When I got the invite for the PUG Reboot event this weekend, I registered quickly, mainly because I was interested in two of the four sessions of track#1. Track#2 was focused on Visual Studio and related topics. Track#1 had sessions involving Azure. Of the two sessions I was interested in, one was about DevOps-style features using the PowerShell workflow introduced in Azure recently. The other was about certain new features around supporting IoT and complex event processing (CEP)/event stream processing (ESP) applications. I had a particular interest in understanding more on the latter topic because of my stint with CEP/ESP some time back using open source technologies.

The first session was on Azure as an IaaS platform, which was essentially an intro to many features of a typical IaaS platform and its capabilities. The second one was on Azure automation, PowerShell and desired state configuration. It talked about the PowerShell Workflow feature for automating many DevOps tasks for applications hosted on Azure. Similar features have been available on Puppet and comparable technologies for some time now. The advantage of this feature in Azure over Puppet-like tools is that it allows automation of not only configuration management use cases but also monitoring, log processing, etc.

The third session was of most interest to me. The topic was Azure’s support for CEP/ESP kinds of applications (which are primarily arising due to IoT possibilities). During my consulting assignments a few years back, I was involved in a project where we were looking to develop a CEP/ESP platform for the water utilities domain, for IoT-style applications such as intelligent sludge management, waste and waste water treatment plant management, and intelligent, predictive operations for water utility companies. The key thing in such applications is the huge number of events generated by various sources, which need to be analyzed, processed, and used for pattern detection and actionable analytics. So the velocity, variety and volume of events are under consideration in such cases. IoT is involved here in the sense that events are typically generated by telemetry devices. For example, in the water utilities domain: sensors, flow meters, GPS devices on fleets of trucks moving sludge, and sensors and other telemetry devices in the process equipment of treatment plants. The infrastructure required for processing a large volume of events was the least of our concerns during initial prototyping and design. We were more focused on event gathering, pre-processing, processing using tools such as Esper/Drools, machine learning algorithms for analytics, and a visualization framework. What cloud service providers such as Azure have done is solve the problem of the infrastructure and ecosystem required for ultra-fast processing of events, by providing services such as Event Hub and Stream Analytics. Of course, storage of event data for archival and historical trending is anyway part of any cloud infrastructure. All in all, this makes it much more attractive from a time-to-market perspective for IoT applications such as the water utilities ones I was talking about. This is shown in the below diagram from the Azure documentation.

azure-eventhub

Now IoT application developers can focus more on the processing and analytics logic part of the puzzle. The possibilities are immense using modelling, simulation and optimization, and combining those with big data analytics techniques for business benefits.
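The essence of the stream processing described above — continuously evaluating a rule over a moving window of high-velocity events — can be sketched without any platform at all. Here is a toy Python version of the kind of rule an engine like Esper or Azure Stream Analytics would run; the event shape (timestamp, reading) and the threshold are made-up stand-ins for, say, a flow-meter feed.

```python
from collections import deque

def detect_spikes(events, window=5, threshold=100.0):
    """Flag event timestamps where the moving average of the last
    `window` readings crosses `threshold` -- a minimal sliding-window
    pattern-detection rule over an event stream."""
    recent = deque(maxlen=window)   # deque drops the oldest reading itself
    alerts = []
    for ts, reading in events:
        recent.append(reading)
        if len(recent) == window and sum(recent) / window > threshold:
            alerts.append(ts)
    return alerts

# Simulated flow-meter feed: a burst of high readings pushes the
# moving average over the threshold a couple of events into the burst.
feed = [(t, 90.0) for t in range(5)] + [(t, 130.0) for t in range(5, 10)]
print(detect_spikes(feed))  # → [6, 7, 8, 9]
```

A real CEP deployment differs mainly in scale and plumbing (ingesting millions of such events per second, which is exactly the part Event Hub and Stream Analytics take off your hands), not in this core windowing idea.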

A side note: It is interesting to note how technologies evolve and also morph under new terminologies and names. In the past, we used to have data acquisition applications and SCADA to handle, and derive actionable insights from, the data generated by various sensors in a manufacturing plant or any other industrial setting. This is now called Industrial IoT (IIoT); of course, now some of the data comes from Internet-enabled devices (not necessarily from one manufacturing plant).

Update on Azure: I just got to read that Microsoft is acquiring a data science company (which produces commercial versions of the statistical programming software R) to further its machine learning features on Azure.