Monday, 2 June 2014

Pilgrimage To Mt Kailash & Manasarovar

This is a guest post by D.V. Kulkarni, who visited Kailash Manasarovar in 2011 and is visiting again in August 2014 via Uttarakhand on an MEA-conducted tour.




OM NAMAH SHIVAYA

In Sep 2009, I attended a talk about the yatra to Mt Kailash & Manasarovar. Somewhere deep within me I had the desire to undertake this yatra, but the desire had been buried in the hustle-bustle of this mayic world. The talk ignited a spark in me and I decided to go on this pilgrimage. In 2010, somehow, my plan of visiting this holy place fizzled out. Finally, in 2011, my dream materialized. I booked the yatra through a tours-and-travels company.

I started preparation for the yatra in earnest. I had developed many irritating and painful problems due to a hectic and over-stressed life after retirement. As the departure date, 08 Jun 2011, approached, I started having various acute problems. Probably these were tests of my resolve. Anyway, I remained undeterred. I did not have to make many purchases, only to take out things which had not been used in years. One of these was my pair of old leather hand-gloves, purchased in 1980 for a motorcycle ride to McLeodganj in HP, an area inhabited by Tibetans. Now I was going to use the same gloves for visiting Tibet!

We, 12 pilgrims, left Nashik on 08 Jun and reached Kathmandu's Tribhuvan International Airport at 1500 h on the 09th. We were accommodated in the hotel Marshyangdi (named after a river in Nepal) in Thamel, a popular tourist hub. In the evening we had a detailed briefing by the proprietor of Samrat Tours & Travels Pvt Ltd, the associate travel agent organizing the yatra. Another group of 30 pilgrims from Bangalore had also joined Samrat Tours, so ours was a group of 42 pilgrims in total.

10th Jun was free for us. I took the opportunity to take a one-hour ‘Mountain Flight’ in the morning in a 19-seater Buddha Air aircraft. Every passenger was called into the cockpit, turn by turn, and shown different mountain peaks including Mt Everest, Gaurishanker & Shisha Pangma. One gets to see the Himalayan landscape and glaciers during the flight and, of course, a beautiful aerial view of Kathmandu. Buddha Air gave us a photograph of Mt Everest with the slogan: “I did not climb Mt Everest… but touched it with my heart!”

Later we visited different temples of Kathmandu: Pashupatinath, Budhanilkantha (Sleeping Vishnu) & the Swoyambhunath Stupa. Lastly we made last-minute purchases for the trip. I purchased a ‘paanchhu’, a type of rain-coat, which proved immensely useful on this trip.

On 11th Jun at 0400 h, we bathed and then assembled for prayer. In fact, from then onwards, wherever we stopped overnight we always had prayer meetings every morning before embarking on the further journey. We got into buses which took us to the border town of Kodari through the university town of Dhulikhel. After crossing the ‘Friendship Bridge’ we were in Tibet (controlled by China), where we had to undergo security checks and immigration formalities. One peculiar aspect is that the Chinese do not stamp or sign the passport! Here we got Chinese currency: 1 Yuan for Rs 7.3! Here Beijing time is in force, which is 2 h 30 min ahead of IST. We were allotted vehicles- Toyota Land Cruiser 4500s, 4-wheel drive- for the onward journey. These vehicles were to remain with us throughout our trip; the vehicles and drivers belonged to a company in Lhasa. Our driver was a happy-go-lucky, energetic, any-music-loving Tibetan whose understanding of Hindi was restricted to “chalo, chalo!” Each vehicle would carry four pilgrims along with back-packs and a Sherpa. The Sherpas were arranged by Samrat Tours & Travels Pvt Ltd. They are extraordinarily hard workers who never seem to get tired, even at high altitudes. As soon as we would reach a place, while we were recouping, the Sherpas would go about putting up the kitchen tent and preparing tea and meals. Without them this expedition would have been impossible. We were given one Chinese guide for our convoy. In the evening we reached Nyalam, 12375 ft. Beyond this place one does not see a single tree. In the pre-yatra briefing, we had been told to avoid getting cut or bruised, as wounds might not heal quickly and could cause discomfort. I kept brooding over this piece of advice. And lo! I cut my finger seriously while shaving in Nyalam. Law of Attraction!

From here onwards we were given half a Diamox tablet twice daily to prevent altitude sickness. These tablets make one urinate more; diabetic patients also have to be careful, as they can reduce blood-sugar levels. The next day, 12th Jun, was spent in acclimatization. I kept myself busy walking and hill-climbing. This was the first place where people started losing their tempers on flimsy grounds- a high-altitude effect! From here onwards, numerous other effects of high altitude started manifesting: insomnia, lack of appetite, headache, breathlessness, stomach-ache, loose motions, vomiting and high BP. I was lucky enough not to have any of these effects- thanks to my training! On 13th Jun we reached New Dongba. During our journey we saw yaks grazing here and there. The yak is the kamdhenu of these people. I saw people playing snooker in cold, windy conditions. It is a mystery how this game took root in this part of the world!

On the 14th we reached Harshu on the bank of Manasarovar, 14950 ft. As per the earlier plan we were supposed to have two overnight halts, at Saga and Paryang; the change in travel plan gave us a two-day stopover at Manasarovar instead. A blessing in disguise!! This was where we got our first darshan of Mt Kailash. Fulfillment of a long-cherished desire! Unbelievable!! Words fail to describe the emotions. Some prostrated themselves, some were awe-struck, some were mesmerized. I had tears in my eyes- tears of peace and bliss. Most of us took a holy dip in the lake. After the dip, I offered oblations to my ancestors. With the dip in the lake I developed a cough which remained with me throughout the rest of my yatra!

Manasarovar is one of the highest freshwater lakes in the world. It is relatively round in shape, with a circumference of 88 km, a depth of 90 m (about 300 ft), and a surface area of 320 sq km. To the west of Lake Manasarovar is Lake Rakshas Tal. Four legendary rivers originate from this region: (i) the Indus, also called ‘Sindhu’; (ii) the Brahmaputra, or Yarlung Tsangpo; (iii) the Sutlej; and (iv) the Karnali (Ghaghara), the largest tributary of the Ganges. So this region is the hydrographic nexus of the Himalaya. As per Hindu theology, Lake Manasarovar is a personification of purity, and one who drinks its water will go to the Abode of Lord Shiva after death. Bathing in the Manasarovar is believed to cleanse sins committed over even a hundred lifetimes. The lake has a few monasteries on its shores, and the nearby Gurla Mandhata mountain range presents a majestic view.

We stayed in tents. At midnight I came out of the tent to urinate and saw that the sky was absolutely clear, with the moon and stars shining in their glory. A few hours later I was compelled to come out of the tent again. Now the sky was full of black clouds. High on the horizon, in the direction of the lake, I saw a series of 8-10 bright lights. I tried to analyze the scene, but it was beyond my comprehension. Shivering cold forced me to return to the tent. The next day, 15th Jun, I decided to fast. I got ready, informed the other members, and went for a walk along the bank of the Sarovar. The weather was extremely cold; walking kept me warm. Then the weather started to change and the sky began to clear. I sat on a mound for meditation. I experienced absolute peace! Tranquility!! By now it was pretty hot. I collected stones as souvenirs from the bank of the Sarovar and returned to our camping area, where a havan was going on. In the afternoon I took all my rudraksha malas- prayer rosaries (rudra means Lord Shiva and aksha means eye)- to the Lake and sanctified them by dipping them in it.


The next day, 16th Jun, we left for Darchen, 15100 ft. On the way, our group did puja on the bank of Manasarovar and collected holy water from the Lake. Later, on the move to Darchen, we had a close look at Rakshas Tal. The water of this lake is never drunk, as it is considered inauspicious. We reached Darchen at 1500 h. Immediately we all went to Ashtapad, meaning eight steps, the place where the first Jain Tirthankara, Rishabhadev, attained nirvana. This place gives a beautiful close view of Mt Kailash. One of our vehicles got stuck in the snow and it took 3 h to free it, so we could not spend much time here.

On 17th Jun, after breakfast, we were taken to Yama Dwar in our allotted vehicles. Yama Dwar is a stupa with a doorway in between. Passing through this Dwar, it is believed, is passing from one world to another- a sort of rebirth. Going through it is also supposed to remove the fear of death. Here we were given walking-sticks, which are pretty useful in the mountains. This is the starting point for the trek around Mt Kailash- known to Buddhists as the kora, or outer parikrama. After completing 13 koras one is eligible for the inner parikrama, called nangkor. Pilgrims of several religions believe that circumambulating Mt Kailash on foot is a holy ritual that brings good fortune. The parikrama around the holy mountain can also be done on pony or yak, and one can engage a porter to carry one's back-pack. I decided to do the kora on foot without a porter. I saw some perform the circumambulation by making full-body prostrations (kyangcha) the entire way; done this way, it takes at least four weeks of physical endurance to complete the parikrama. Amazing faith and devotion!
     
The first stretch of about 10 km takes one to Dirapuk Gompa, 16236 ft, where we had a night halt. The second stretch, a trek of about 22 km, is the toughest one. The track passes through the highest point, Dolma La, 18600 ft. Just before the pass is the Ganesh Kund. Dolma La belongs to Devi Parvati and is an important point of worship; many pilgrims sacrifice their prized possessions here. This point is considered the point of spiritual renaissance. After the pass is Gauri Kund- the bathing place of Goddess Gauri, and the setting of the story of Lord Ganesha acquiring his elephant head. On this leg of the parikrama we encountered rain and snow. The stretch took us to Zutulpuk Gompa, 15825 ft, for the night halt. The last stretch of the parikrama, of about three hours, is the easiest one, in which one returns to Darchen. After lunch there, we immediately embarked on the return trip, with night halts at Saga and Zangmu. Zangmu, on the Tibetan side, 7 km from the Friendship Bridge, reminded me of Gangtok. On 22 Jun we returned to Kathmandu.

On 23rd Jun we visited the temple of Manakamana Devi, who is believed to bless her devotees by fulfilling their wishes. It is located above the banks of the Trishuli river, 105 km from Kathmandu, and one reaches the temple by cable car. During the journey and at the Temple, one gets spectacular views of deep valleys, terraced fields and snow-capped mountain ranges. On the return journey to Kathmandu we encountered a landslide which blocked the road and delayed us by about three hours, so we could not go to Pashupatinath Temple for thanksgiving as planned earlier. I was feeling extremely upset at missing a last darshan of Pashupatinath. But lo! The next day- the day of leaving Kathmandu- our flight to Mumbai was delayed by two hours, and so we could fulfill our desire of visiting Pashupatinath!


Before the commencement of the yatra I had pondered a lot over what important thing- whatever affects my life the most- should be sacrificed on this pilgrimage. I could not come to any conclusion, so I left the thought. On the last day of the trip, during the visit to Manakamana Temple, I lost my specs. Without specs my activities become restricted, and I was upset with my forgetfulness. Then it struck me: the specs were the most indispensable thing in my life, and I had been forced to sacrifice them on this pilgrimage!


Throughout the vehicular journey the road is smooth and black-topped, except for a stretch of a few km between Nyalam and Saga. Everywhere the living accommodation was good, but the toilet facilities were very primitive. The hotel accommodation in Zangmu was the best of the entire tour.

A pilgrimage to the great sacred Mt Kailash & Manasarovar is a life-changing experience and an opportunity to view some of the most magical scenery. It is a Himalayan utopia- a Shangri-La. The entire area echoes with spiritual vibrations; even agnostics may find their beliefs changing amidst that fathomless serenity. One has to experience it. Mt Kailash is treated as a place of eternal bliss. Shiva is the Lord of Yoga, and here He sits in a state of perpetual meditation along with his wife, Parvati. I saw people always helping others; here one realizes that love is the essence of life. Though the terrain and weather are treacherous, and the effect of low temperature is worsened by the wind-chill factor, it is still worth visiting this venerated place. All that is required is strong desire and a bit of willpower!

***

Saturday, 24 May 2014

Top open source projects in Java

Here is what GitHub says (I am surprised to see Spring at #10):
https://github.com/search?l=Java&o=desc&q=stars%3A%3E0&ref=advsearch&s=&type=Repositories







Friday, 9 May 2014

Strategy Guide to Entering into Freelancing

I attended my engineering college's alumni meet last week. Many juniors approached me to ask what needs to be done to work as a freelancer, how to get projects, and much more. Even students had these questions. So I'm jotting down some important points on pitching into the competitive field of freelancing and excelling in it.

Create a profile which is accessible on the net

LinkedIn is presently the most widely used professional networking site. Create a profile and keep it updated. The key features of the portfolio should be: summary, experience (projects/paper presentations/technical competitions/conferences) and academic details. Recommendations and endorsements are an added advantage.

Create a project/app and publish it on an app engine or one of the web stores

If one is interested in mobile application development, develop and publish the app on the respective store- Android or iOS. Similarly for web apps. Also publish the source code on GitHub to showcase coding skills.

Register yourself on freelancing websites

There are multiple websites where one can bid for projects; Elance and Freelancer are two of them.
Add a summary of the points listed above to your profile there. One needs to keep an eye out for projects one is interested in or well-versed at. Look for a suitable project and apply.

Participate in networking activities

Groups conduct regular meetups, and attending them helps build a social network. Folks on both sides of the need attend: those who have projects and want to outsource them, and those who want projects to work on. Try to get projects from people within your network.

Participate in design/development/data sciences contests

Various competitions are held at different levels- local, national, international. These competitions give us a close look at real-world problems and at arriving at a solution. We also learn to think all the way from problem analysis to a working solution.

Contribute to some existing open-source projects

To begin with, add a few projects (usually apps which you use often) to your watch list and raise bugs/issues. The next step is suggesting features for the app.

Friday, 25 April 2014

[Big Data] Apache Kafka - Part I


A huge amount of real-time data is continuously generated these days by various sources. There are many examples. On Facebook, your feed is continuously populated with newer and newer items, and the recent activities of your friends keep updating as and when any activity happens. Similarly, the question-and-answer site Quora shows your notifications, answers, upvotes, newly asked questions, etc., without you having to click the refresh button. Twitter is another very good example.

On the other hand, there are many applications which want to consume this data. In most cases these data-consuming apps are not connected to the data-producing apps. Since we do not have the producers and consumers under the same umbrella, we need a mechanism which will seamlessly integrate the two ends, so that producers need not even know who the consumers are. They just have to push messages to a system as and when the messages are generated.

The data generated today is Big Data. We got familiar with Big Data in a previous post; we know its characteristics- volume, velocity and variety- as well as its importance. The size of the data poses a big challenge for this integration system. In most cases it is not just about consuming the data but also about performing analytics on it; real-time analytics on huge amounts of data, producing real-time outputs, has to be catered to. Yes, there are some systems which do not need real-time data: when they want to consume data, they connect, fetch the data generated till that time, go offline again, and then perform analytics on it.

Kafka is the intermediate system between the producers and the consumers which seamlessly allows different kinds of applications to consume messages. It is a publish-subscribe commit-log system, designed to process real-time activity streams such as news feeds and logs.

It was developed at LinkedIn and later open-sourced. The need arose because LinkedIn had to deal with a very large number of events- e.g. updates and user activity- with low latency.

Kafka is a distributed, partitioned system. Messages are saved under various 'topics'. For each topic, Kafka stores messages in partitions, for the sake of scaling, fault tolerance and parallel consumption. Each partition is an ordered, immutable sequence of messages that is continually appended to- a commit log. The log is retained for a predefined amount of time. Consumers can subscribe to multiple topics. Messages are stored in order and each message gets a sequential id called its 'offset'. Each consumer tracks its own offset, which advances as it consumes messages; usually a consumer will consume messages in order.
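As a toy illustration- emphatically not the real Kafka API- the partition/offset model described above can be sketched as an in-memory append-only log with a per-consumer offset (all names here are hypothetical):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy model of one Kafka partition: an append-only, ordered message log.
// Each consumer tracks its own offset; consuming does not delete messages,
// so multiple consumers can read the same log independently.
class PartitionLog {
    private final List<String> messages = new ArrayList<>();
    private final Map<String, Integer> offsets = new HashMap<>();

    // Producer side: append returns the sequential id (offset) of the message.
    int append(String message) {
        messages.add(message);
        return messages.size() - 1;
    }

    // Consumer side: read the next message for this consumer, advancing
    // only that consumer's offset. Returns null when the consumer is caught up.
    String poll(String consumerId) {
        int offset = offsets.getOrDefault(consumerId, 0);
        if (offset >= messages.size()) return null;
        offsets.put(consumerId, offset + 1);
        return messages.get(offset);
    }

    public static void main(String[] args) {
        PartitionLog log = new PartitionLog();
        log.append("update-1");
        log.append("update-2");
        // Two independent consumers each start reading from offset 0.
        System.out.println(log.poll("analytics")); // update-1
        System.out.println(log.poll("alerts"));    // update-1
    }
}
```

The real broker adds persistence, replication and time-based retention on top of exactly this shape of log.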

(Image: kafka_topicPartition.png - the anatomy of a topic's partitioned log)

A server in a Kafka cluster is called a broker. The cluster retains messages for a predefined period, so even if a consumer is not continuously connected, it can connect periodically and consume the messages published in the meantime.

It is up to the producer to decide to which topic, and to which partition within it, a message gets published. Consumers can be grouped into consumer groups. When a message is published, it is delivered to one consumer within each subscribing consumer group.

A single Kafka broker can handle hundreds of megabytes of reads and writes per second from thousands of clients.
Messages are persisted on disk and replicated within the cluster, so a message can be consumed multiple times without data loss. Kafka is cluster-centric, which gives it fault tolerance and durability.


Sunday, 20 April 2014

AnswerReader : An Awesome App in the Making!

AnswerReader is a powerful app that organizes and customizes Quora, providing a friendly experience.
AnswerReader acts as a single interface for performing various activities. Using Quora from a browser will most likely result in multiple browser tabs being opened; AnswerReader instead provides a multi-column view of Quora. The user can save which topics he wants in those columns. These saved topics appear in their set order until the user removes or changes them, so there is no need to set them up every time. Any of the columns can be scrolled into the main readable area by simply clicking the topic name in the left navigation panel. AnswerReader also provides quick access to most profile-related info, like stats and credits.
Features
  • Create a customized Quora view: Manage columns, shortcuts and much more- all in one app.
  • Boost Productivity: No need to save what you want to see in the AnswerReader columns every time.
  • All-in-one interface: Manage answers, comments, replies, upvotes, drafts, posts.
  • Stay Focused: Never miss anything related to the topics you are most interested in.
  • Multiple Shortcuts: Shortcuts without leaving the main page- no more opening multiple browser tabs.
  • Follow without actually following: Keep track of the activity on a topic or question even without following it on Quora.
  • Manage What or Whom to Follow: Follow or unfollow a question, topic or person.
  • Become a Power User: Everything that you can do on Quora, plus ease of use and the multi-column view.
Download from Chrome Web Store: 
AnswerReader is an open source project. Fork it on GitHub:
Note: This tool is under active development. A lot of new features are coming soon.

Thursday, 10 April 2014

Big Data : An Introduction

What is Big Data?

For years, companies have been making decisions based on analytics performed on huge amounts of data stored in relational databases. Saving data in a structured manner in relational databases used to be very costly; storing and processing huge amounts of unstructured data (big data) is much cheaper and faster. This is the main reason big data has attracted so much attention. Big data typically comes from the following sources:
  • Data from social media sites.
  • Enterprise data including customer related information of CRM applications.
  • Logs of any system- be it related to software or manufacturing.
Big data and its processing are characterised by 3 qualities:
  • Volume : Normally we speak of gigabytes; here it runs into terabytes, petabytes, exabytes and beyond- and the amount keeps on increasing.
  • Velocity : Data arrives fast and must be processed fast. Relational databases do not scale linearly; with big data systems we expect the same performance even when the data is huge.
  • Variety : Most of the data is unstructured, with a small amount of structured data too.
Why is big data important?

The economic value of big data varies a lot. Sometimes the advantages are indirect, as in decision making. Typically there is a good amount of information hidden within this big chunk of unstructured data, and we should be able to figure out precisely which part of the data is valuable and usable. The question then becomes what we ultimately do with such huge amounts of data, and how fast the analytics can be produced. There are many use cases: Twitter wants to find the most retweeted or trending tweets, or tweets containing a particular hashtag; Google has to serve the results of countless queries; ad publishers need to know how many new ads have been posted; Quora has to publish newly posted questions and generate a news feed per user based on the topics and people followed; millions of notification emails are sent from many websites; app stores count application downloads; news sites surface newly published articles and posts; and a whole lot more. Especially with the increasing use of smartphones and GPS-enabled devices, ad publishers want to display location-specific ads, or ads for stores located near the user's current location. This helps in targeting the right set of customers.
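To make one of these use cases concrete, here is a small, hypothetical Java sketch that counts hashtag occurrences across a batch of tweets. Real systems do this continuously, over streams, at a far larger scale; the class and method names are illustrative only:

```java
import java.util.List;
import java.util.Map;
import java.util.regex.MatchResult;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

// Hypothetical sketch: count hashtag frequencies in a batch of tweets.
class HashtagCounter {
    private static final Pattern HASHTAG = Pattern.compile("#\\w+");

    // Lower-case each tweet so #BigData and #bigdata count together,
    // extract every hashtag, then group and count.
    static Map<String, Long> count(List<String> tweets) {
        return tweets.stream()
                .flatMap(t -> HASHTAG.matcher(t.toLowerCase()).results())
                .map(MatchResult::group)
                .collect(Collectors.groupingBy(tag -> tag, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of(
                "Learning #BigData today", "#bigdata and #kafka", "just #Kafka"));
        System.out.println(counts); // e.g. {#bigdata=2, #kafka=2}
    }
}
```

The batch version above is trivial; the real challenge discussed in this post is doing the same aggregation over unbounded, fast-arriving data.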

Challenges

To get the maximum benefit from big data, we should be able to process all of the data together instead of processing it in distinct small sets. Since this data is much larger than our traditional data, the real challenge is to handle it in a way that overcomes the computational and mathematical hurdles. The heterogeneous and incomplete nature of the data must also be tackled: when we do manual computations we can handle such heterogeneity, but when the processing is done by a machine, it expects the data to be complete and homogeneous. Another challenge with big data is its volume, which keeps increasing rapidly. On the hardware front, this is taken care of by increasing the number of cores rather than simply increasing the clock speed, and by replacing traditional hard disk drives with storage offering better I/O performance. Cloud computing also helps handle the volume challenge by being able to process varying workloads. Finally, the large volume of data makes timeliness difficult to achieve: the more data there is to process, the longer it takes, yet in most cases getting the analysis quickly is crucial.

Wednesday, 19 February 2014

Independent projects I worked on while being a software developer

Note : Everything mentioned here was developed after regular office hours, mostly for fun/learning purposes only.

When I started my career as a software developer (Java), all I knew was OOP concepts, the Collections, I/O and Exception packages, a bit of multi-threading, and XML (DOM parser only).

Apart from regular day-to-day development, the first personal project I worked on was a file-search app, very similar to how Windows file search works. After some coding, I was able to: 1. search in sub-directories; 2. search by file type/modified date; 3. search by file-name patterns (*VO.*, notes*.txt), etc.
Next, I wanted to create a UI for this app, so I learnt Swing and created a nice (if I may say so) UI for it.
I couldn't find time to do file indexing to improve search performance.
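The pattern-matching core of such a file search can be sketched in a few lines of modern Java using NIO glob matchers. This is a minimal illustration, not the original app's code, and the names are hypothetical:

```java
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.PathMatcher;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Minimal sketch of a recursive file search by file-name glob.
class FileSearch {
    // Recursively find regular files whose names match a glob
    // such as "*VO.*" or "notes*.txt".
    static List<Path> find(Path root, String glob) throws IOException {
        PathMatcher matcher = FileSystems.getDefault().getPathMatcher("glob:" + glob);
        try (Stream<Path> walk = Files.walk(root)) {      // walks sub-directories too
            return walk.filter(Files::isRegularFile)
                       .filter(p -> matcher.matches(p.getFileName()))
                       .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("search-demo");
        Files.createFile(dir.resolve("notes1.txt"));
        Files.createFile(dir.resolve("readme.md"));
        System.out.println(find(dir, "notes*.txt").size()); // prints 1
    }
}
```

Filtering by modified date would be one more `filter` on `Files.getLastModifiedTime`; indexing, as noted above, is a separate project.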

A few months later, I got more interested in Swing and started working on another project- a Java-based IDE. It was just for fun, not an attempt to build something better than Eclipse or NetBeans :) After spending a few weekends on coding, I was able to build and run a Java project through my IDE. Auto-suggest for method names etc. was interesting to develop (I learnt reflection).

By the 2nd year of my software development career I was working on web/enterprise apps and getting introduced to various web technologies- JSP, Servlets, Struts, JSF, GWT, etc. Influenced by the magic of web technologies, I decided to build my own social network (you can laugh now :)). I knew I would never launch it, but it helped me think like Mark Zuckerberg. Someone found the project interesting and eventually used it for a closed-group network (with a small user base). To be honest, what I gave them was a very basic version (I realised it was a lot of work and not worth spending that much time on), which they later got enhanced by others.

Learning so far from my independent projects - Even though I was a software developer in good companies, I was working as a Product Manager, Designer, Architect and Programmer on my pet projects.

My next project started when Google App Engine was launched. It did not take me more than a second to realize that I could now host and run my web applications for free. I was so motivated that I learned Python to create my first web app on GAE (Python being the only supported language at that time). I published one more app a few months later, but gradually lost interest, as I was using Java, Java EE, Spring, Hibernate etc. in my office work.

But hey... wait a second... Google adding Java support to GAE? Is it true? ...Yes it is... and GAE with Java support was released. I had a big smile on my face!!!

And then I started again and never actually stopped. To date I have created around 15 apps (using 4 Google Accounts).

Learning so far - apart from what I highlighted above, I also got the opportunity to learn a new language (and its related tech stack) and GAE (and hence cloud computing- IaaS/PaaS/SaaS and other cloud service providers), and I enjoyed seeing my web applications live (at appspot dot com).

Next - I became an API maniac. I got into the habit of living and breathing APIs. Every week I would choose some API from ProgrammableWeb (an API directory) and do something with it. Apart from teaching me API programming, this also helped me win an iPad in the PayPal X Developer Challenge.

Chrome Apps and Extensions - rolling out my ideas in the form of utilities was quick, easy and interesting. For example, 'Java Populars', which I built in half an hour, has 40K+ users. Similarly, the News-You-Like and Favorite-Bollywood-Tweets apps got featured in Digit magazine. I learnt a lot about HTML5 and JavaScript through this and have built 20+ apps/extensions so far- 'Quick Chart', 'Simple Task Manager', 'TechCrunch Slides' etc., to name a few.

Summary : The entire journey helped me become a better contributor to the main projects (for which I am getting paid).

PS : I'm getting lazy about sharing my interest in mobile apps (and other areas) and what I did there.

Thursday, 7 November 2013

OGNL Implementation in Struts

Struts 1.x

Struts 1.x used an expression language (EL) which has JSTL as its base. The Struts-EL tag classes are subclasses of the Struts tag classes. The EL has basic object-graph traversal, but it is not very powerful and its indexed-property support is quite weak.

Struts 2

Struts 2 was released in 2007 with many exciting features. Compared to Struts 1.x, it simplified app development by automating data transfer (from form beans to data beans and vice versa, which was manual in Struts 1.x) and type conversion (parsing strings into doubles/integers, along with the related exception handling, had to be done by hand in Struts 1.x).

OGNL

With the Object-Graph Navigation Language (OGNL) in Struts 2, data can be transferred to and from complex data structures like List and Map. User-defined types can be supported with the help of custom converters, which are quite easy to write. OGNL acts as a layer between the Struts 2 framework and the Java-side processing.
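To make "object-graph navigation" concrete, here is a toy sketch in plain Java- emphatically NOT the real OGNL library- that resolves a dotted expression like "address.city" against a root object by chaining getter calls via reflection. All names are hypothetical:

```java
import java.lang.reflect.Method;

// Toy illustration of object-graph navigation (NOT the real OGNL library):
// "address.city" -> root.getAddress().getCity()
class TinyNavigator {
    static Object getValue(Object root, String expression) throws Exception {
        Object current = root;
        for (String part : expression.split("\\.")) {
            // Derive the JavaBean getter name from the property name.
            String getter = "get" + Character.toUpperCase(part.charAt(0)) + part.substring(1);
            Method m = current.getClass().getMethod(getter);
            m.setAccessible(true);   // tolerate non-public declaring classes in this toy
            current = m.invoke(current);
        }
        return current;
    }
}
```

The real OGNL does far more (type conversion, collections, method calls, indexed access), but this is the core idea: an expression string is mapped onto a chain of property accesses.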

ValueStack

OGNL greatly relieves the developer of extra coding and maintenance effort. It binds Java-side data directly to the corresponding fields in the view layer, and built-in data converters save conversion work as data passes to or from the Java environment. Field names in HTML can be generated using OGNL expressions so that they bind directly to the corresponding Java properties, eliminating redundant code in Action classes. In contrast to the standard JSP mechanism of binding objects into the page context (as in Struts 1.x), Struts 2 uses the ValueStack, through which taglibs can access values without coupling the view to the type of object it is rendering. The ValueStack is set as OGNL's root object. It contains application-specific objects such as the action, as well as the model objects.

Context Map



The OGNL context is set to the ActionContext. The ActionContext is the container of objects in which the action is executed; we get a reference to it by simply calling ActionContext.getContext(). There are other objects in the ActionContext, such as Maps (referred to as the context map) representing the application, session and request contexts.

Data Access

The root object is referred to simply by its name, without any special symbol prefixing it. Since the Action instance is always pushed onto the ValueStack, which is the OGNL root, references to Action properties can omit the pound sign. For the rest of the objects in the ActionContext, '#' has to be used.
Example- to refer to an Action property:


<s:property value="firstName"/>

For other objects:


<s:property value="#session.username"/>

or

<s:property value="#session['username']"/>

Similarly we can refer to properties of request, application or attr (an attribute in any scope).
Collections can also be referred to using OGNL. A List is written as {value1, value2, ...}:


<s:select label="Continent" list="{'Asia', 'Europe', 'Africa'}" value="defaultContinent"/>

Alternatively, the list can be populated in the Action class, with its getter and setter provided.

A Map is written as #{key1: value1, key2: value2, ...}:


<s:select label="Continent" list="#{'first':'Asia', 'second':'Europe', 'third':'Africa'}" value="defaultContinent"/>

In the case of a Set, we can check whether an item exists in it using 'in' or 'not in':


<s:if test="'Asia' in {'Asia', 'Europe', 'Africa'}">
        Exists in the set
</s:if>

Percent (%) symbol

It is used to force OGNL expression evaluation, which results in querying the ValueStack for the property:


<s:property value="%{continent}"/>  (evaluates 'continent' against the ValueStack instead of treating it as a literal)

At (@) symbol

It is used to refer to static properties and methods. Static method access has to be enabled in the struts properties file by setting-


struts.ognl.allowStaticMethodAccess=true
and then access like this-

<s:property value="@com.test.TestClass@STATIC_PROP" />

Dollar ($) symbol

Used in JSTL expressions.


The name ‘OGNL’ might have sounded difficult to first-time readers, but I’m sure that after reading this post you are ready to shake hands with OGNL!!


Friday, 6 September 2013

Challenging Work

When do you find a given task hard/difficult/challenging? In school days we used to call it 'hard', in college days 'difficult', and on the job 'challenging'! A task feels challenging when:

  1. The work needs to be delivered in less time than you estimated for it.
  2. You don't have clarity on the work, but the delivery date is fixed.
  3. The problem statement is clear, but you don't know how to do it.
  4. Multiple tasks are assigned to you and the priorities keep changing.
  5. You need to take leave for some reason, and all of a sudden an easy task becomes challenging.
  6. Sometimes you do not like working on a particular task, and that is why it is challenging for you.
  7. You are stressed, and everything looks challenging to you.
  8. When the delivery date is very close, work appears extraordinarily challenging.
  9. Someone couldn't do it and the same work gets delegated to you... especially when that 'someone' is a person you consider a techie. (Such a situation can have a positive impact too- you take it up as a challenge!)
  10. You are sincerely working on the task and your boss repeatedly asks, 'Done?'
  11. A combination of the above.

Have I missed any scenario?



Wednesday, 31 July 2013

When to use AngularJS and when to use Backbone!

In this post I am documenting my thoughts on when to use Backbone and when to use AngularJS, with the help of examples. (This post is not a comparison between AngularJS and Backbone.)

E-Commerce Application

It should not be a single-page app (unless we are trying to address a few small use-cases or building a shopping site for a small merchant).

We need a framework which provides support for data binding. Struts OGNL, Spring MVC, JSF EL etc. work fine, but these are backend technologies; for a modern web app we need similar support in JavaScript. AngularJS from Google offers this kind of functionality and hence is recommended.

Online HTML/CSS Builder 

Edit functionality becomes more important than view/read. It should be a single-page app (the user experience is better in this case).

A lot of UI-specific work is involved- drag-and-drop, animation effects, etc.- while text rendering is limited. AngularJS can be used to create a SPA here, but Backbone is recommended.

Interactive Reporting Tool

Dynamic UI, report editing and cool UI effects. Backbone is recommended. Please note that Backbone alone is not sufficient, so we must use a suitable JS tech stack which is known to work properly with Backbone.

Content Management System

Data representation in TEXT format. Edit functionality using regular forms.
A single-page app with content managed in various views. AngularJS is recommended.

Question - What will I use if I have to build GMail? 
Answer - AngularJS :)

* In most projects where I use Backbone, I define a proper architecture for the JS layer, design it with the help of a complete JS tech stack, and let Backbone play its role (I do not let Backbone drive the architecture of the front-end).