Archive for the ‘Open Source World’ Category

Liferay 7 m4 setup

Posted: May 15, 2015 by Narendra Shah in Liferay, Uncategorized

I was curious to see OSGi in Liferay 7, and that curiosity led me to install it.

Setting up Liferay is easy:
1. Download the Liferay 7 m4 Tomcat bundle from http://sourceforge.net/projects/lportal/files/Liferay%20Portal/7.0.0%20M4/
2. Unzip the file.
3. If you already have JDK 7/8 set up with the JAVA_HOME environment variable, you are ready to start.
4. The intention here is not to set up a separate database, so I tried HSQL, Liferay's default database.
5. Navigate to the directory liferay-portal-7.0-ce-m4\tomcat-7.0.42\bin inside the extracted folder.
6. Run startup.bat (Windows) or startup.sh (Linux).
7. After Liferay started, I saw a basic configuration screen asking for a name, email address, etc.
8. I filled in the basic details and pressed Finish; it took a while before it was done. I then agreed to the Terms & Conditions, and my portal was ready. Here are some screenshots of the first look.
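The steps above can be sketched as a small shell script. This is only a sketch: the archive and folder names follow the m4 bundle mentioned in the post and may differ for other builds.

```shell
#!/bin/sh
# Quick-start sketch for the Liferay 7 m4 Tomcat bundle; file and folder
# names are taken from the post and may change between milestone builds.
ZIP="liferay-portal-tomcat-7.0-ce-m4.zip"
DIR="liferay-portal-7.0-ce-m4/tomcat-7.0.42"

# Step 3: Liferay needs a JDK on the path.
[ -n "$JAVA_HOME" ] || echo "warning: JAVA_HOME is not set"

# Step 2: unzip the downloaded bundle if it is present.
if [ -f "$ZIP" ]; then
  unzip -q "$ZIP"
fi

# Steps 5-6: start Tomcat from the bundle's bin directory.
if [ -d "$DIR/bin" ]; then
  # use startup.bat instead on Windows
  (cd "$DIR/bin" && ./startup.sh)
else
  echo "extract the bundle first: $DIR not found"
fi
```

Once Tomcat is up, the setup wizard from steps 7-8 is served on port 8080.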

Whooo, it works well and is much faster than the old versions. The UI is very clean, everything is properly categorized, and it is really well managed. The basic concepts (site, layout, control panel, role, web content, etc.) remain as they were.

The Control Panel is properly categorized into Users, Sites, Apps, and Configuration. Apps covers portlet configuration, where portlets can be activated and deactivated.

I was looking for a log analyzer for logs from our system, and I found these interesting tools.

1. Scribe – Real time log aggregation used in Facebook
Scribe is a server for aggregating log data that is streamed in real time from clients. Developed and maintained by Facebook, it is designed to scale to a very large number of nodes and to be robust to network and node failures. A Scribe server runs on every node in the system, configured to aggregate messages and send them to a central Scribe server (or servers) in larger groups.

https://github.com/facebook/scribe

2. Logstash – Centralized log storage, indexing, and searching

Logstash is a tool for managing events and logs. You can use it to collect logs, parse them, and store them for later use. Logstash comes with a web interface for searching and drilling into all of your logs.

http://logstash.net/
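As a hypothetical sketch of what Logstash configuration looks like (the file path is a placeholder), this minimal config tails a log file and prints parsed events to stdout:

```
input {
  file {
    path => "/var/log/syslog"
  }
}
output {
  stdout { }
}
```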

3. Octopussy – Perl/XML Logs Analyzer, Alerter & Reporter
Octopussy is a log analyzer tool. It analyzes logs, generates reports, and alerts the admin. It has LDAP support for maintaining a user list, exports reports by email, FTP, and SCP, supports scheduled reports, and uses RRDtool to generate graphs.

http://sourceforge.net/projects/syslog-analyzer/

4. Awstats – Advanced web, streaming, ftp and mail server statistics
AWStats is a powerful tool that graphically generates advanced web, streaming, FTP, or mail server statistics. It can analyze log files from all major server tools like Apache, WebStar, IIS, and a lot of other web, proxy, WAP, streaming, mail, and some FTP servers. This log analyzer works as a CGI or from the command line and shows you all the information your log contains, in a few graphical web pages.

http://awstats.sourceforge.net/

5. nxlog – Multi platform Log management
nxlog is a modular, multi-threaded, high-performance log management solution with multi-platform support. In concept it is similar to syslog-ng or rsyslog, but it is not limited to Unix/syslog only. It can collect logs from files in various formats and receive logs from the network remotely over UDP, TCP, or TLS/SSL. It supports platform-specific sources such as the Windows Event Log, Linux kernel logs, Android logs, local syslog, etc.

http://nxlog.org/

6. Graylog2 – Open Source Log Management
Graylog2 is an open source log management solution that stores your logs in ElasticSearch. It consists of a server written in Java that accepts your syslog messages via TCP, UDP, or AMQP and stores them in the database. The second part is a web interface that allows you to manage the log messages from your web browser. Take a look at the screenshots or the latest release info page to get a feeling for what you can do with Graylog2.

http://graylog2.org/

7. Fluentd – Data collector, Log Everything in JSON
Fluentd is an event collector system. It is a generalized version of syslogd that handles JSON objects for its log messages. It collects logs from various data sources and writes them to files, databases, or other types of storage.

http://fluentd.org/
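A minimal Fluentd configuration sketch: it tails a file and emits each event to stdout. The tag and path are placeholders, and the `@type` directive names follow recent Fluentd releases (older versions used `type` without the `@`):

```
<source>
  @type tail
  path /var/log/app.log
  tag app.logs
</source>

<match app.logs>
  @type stdout
</match>
```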

8. Meniscus – The Python Event Logging Service

Meniscus is a Python-based system for event collection, transit, and processing in the large. Its primary use case is large-scale cloud logging, but it can be used in many other scenarios, including usage reporting and API tracing. Its components include Collection, Transport, Storage, Event Processing & Enhancement, Complex Event Processing, and Analytics.

https://github.com/ProjectMeniscus/meniscus

9. lucene-log4j – Log4j file rolling appender which indexes log with Lucene
lucene-log4j solves a recurring problem that production support teams face whenever a live incident happens: filtering production log statements to match a session/transaction/user ID. It works by extending Log4j's RollingFileAppender with Lucene indexing routines. Then, with a LuceneLogSearchServlet, you get access to your logs through a web front end.

https://code.google.com/p/lucene-log4j/

10. Chainsaw – log viewer and analysis tool
Chainsaw is a companion application to Log4j written by members of the Log4j development community. Chainsaw can read log files formatted in Log4j's XMLLayout, receive events from remote locations, read events from a database, and even work with JDK 1.4 logging events.

http://logging.apache.org/chainsaw/

11. Logsandra – log management using Cassandra
Logsandra is a log management application written in Python that uses Cassandra as a back-end. It was written as a demo for Cassandra, but it is worth a look. It also supports creating your own parsers.

https://github.com/jbohman/logsandra

12. Clarity – Web interface for the grep
Clarity is a Splunk-like web interface for your server log files. It supports searching (using grep) as well as tailing log files in real time. It is written with an event-based architecture on top of EventMachine, and so allows real-time search of very large log files.

https://github.com/tobi/clarity

13. Webalizer – fast web server log file analysis
The Webalizer is a fast web server log file analysis program. It produces highly detailed, easily configurable usage reports in HTML format for viewing with a standard web browser. It handles standard Common Logfile Format (CLF) server logs, several variations of the NCSA Combined logfile format, wu-ftpd/proftpd xferlog (FTP) format logs, Squid proxy server native format, and W3C Extended log formats.

http://www.webalizer.org/

14. Zenoss – Open Source IT Management
Zenoss Core is an open source IT monitoring product that delivers the functionality to effectively manage the configuration, health, and performance of networks, servers, and applications through a single, integrated software package.

http://sourceforge.net/projects/zenoss/?source=directory

15. OtrosLogViewer – Log parser and Viewer
OtrosLogViewer can read log files formatted in Log4j (pattern and XMLLayout) and java.util.logging formats. The source of events can be a local or remote file (FTP, SFTP, Samba, HTTP) or a socket. It has many powerful features like filtering, marking, formatting, adding notes, etc. It can also format SOAP messages in logs.

https://code.google.com/p/otroslogviewer/wiki/LogParser

16. Kafka – A high-throughput distributed messaging system
Kafka provides a publish-subscribe solution that can handle all activity stream data and processing on a consumer-scale web site. This kind of activity (page views, searches, and other user actions) is a key ingredient in many of the social features on the modern web. This data is typically handled by logging and ad hoc log aggregation solutions because of the throughput requirements; Kafka offers a unified platform for this kind of data and can also feed logging data into Hadoop.

https://kafka.apache.org/

17. Kibana – Web Interface for Logstash and ElasticSearch
Kibana is a highly scalable interface for Logstash and ElasticSearch that allows you to efficiently search, graph, analyze, and otherwise make sense of a mountain of logs. Kibana load-balances against your ElasticSearch cluster. Logstash's daily rolling indices let you scale to huge datasets, while Kibana's sequential querying gets you the most relevant data quickly, with more as it becomes available.

https://github.com/rashidkpc/Kibana

18. Pylogdb

pylogdb is a Python-powered, column-oriented database suitable for web log analysis.

http://code.ohloh.net/project?pid=&ipid=129010

19. Epylog – a Syslog parser
Epylog is a syslog parser which runs periodically, looks at your logs, processes some of the entries in order to present them in a more comprehensible format, and then mails you the output. It is written specifically for large network clusters where a lot of machines (around 50 and upwards) log to the same loghost using syslog or syslog-ng.

https://fedorahosted.org/epylog/

20. Indihiang – IIS and Apache log analyzing tool
Indihiang is a web log analyzing tool. It analyzes IIS and Apache web logs and generates real-time reports. It has a web log viewer and analyzer and can analyze trends from the logs. The tool also integrates with Windows Explorer, so you can open a log file in Indihiang via the context menu.

http://help.eazyworks.com/index.php?option=com_content&view=article&id=749:indihiang-web-log-analyzer&catid=233&Itemid=150

Starting With Liferay development environment setup

Posted: December 3, 2013 by Narendra Shah in Liferay

This post is for first-time Java developers who would like to set up and use Liferay.

The steps required to start Liferay on your local system are easy.

1. Get the portal.zip bundled with Tomcat from the Liferay download page.

2. Unzip it to any folder; the unzipped folder contains the portal, with Tomcat inside it.

3. Open the tomcat/bin folder inside the portal folder, then run startup.bat (on Linux it is startup.sh).

4. This starts Liferay with the default configuration. Once the server has started, open hostname:8080 in a browser and you will see a wizard for configuring the JDBC database connection and the default user. Note that you first need to create a blank database in your favorite RDBMS; when Liferay starts, it will create all the required tables, indexes, etc.

5. That's it: your Liferay instance is running locally.

6. Now you can play with your instance.
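For reference, the JDBC settings chosen in the wizard can also be supplied up front in a portal-ext.properties file alongside the portal. A hypothetical MySQL example (the database name and credentials are placeholders):

```
jdbc.default.driverClassName=com.mysql.jdbc.Driver
jdbc.default.url=jdbc:mysql://localhost/lportal?useUnicode=true&characterEncoding=UTF-8
jdbc.default.username=liferay
jdbc.default.password=secret
```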

Will post more advanced stuff in next post.

Reviewed: Liferay Portal Systems Development book

Posted: March 18, 2012 by Narendra Shah in Liferay, Open Source World

This book contains lots of in-depth content related to Liferay's built-in functionality. It is recommended for someone with basic Java/J2EE and Liferay knowledge who wants to dig deeper into Liferay. I did not find this much ready-made material even in the Liferay documentation. Rather than reviewing the whole content at once, I will review it chapter by chapter.

Chapter 1: Liferay Enterprise Portal: This chapter contains information about the different Liferay solutions, development methodologies, when to use what, development stages, etc. It gives an abstract overview of Liferay and ways of working with it.

Chapter 2: Service-Builder and Development Environment: This chapter covers development environment setup, the portal and plugin structure, Liferay's very important Service Builder feature, and finally plugin development. The chapter lacks some examples but provides very in-depth knowledge of the features.

Chapter 3: Generic MVC Portlets: Here, all information related to plugin portlets is presented, along with very useful "What's Happening Here" sections that explain what is happening in the given codebase. This chapter covers all the XML, XSD, and database configuration in a very nice way. I found it lacking some information on Spring portlets, database persistence models like Hibernate/JPA, and inter-portlet communication.

Chapter 4: Ext Plugin and Hooks: This chapter specifies where to use Ext plugins, where to use hooks, and some advanced tricks for overriding Liferay (the book says "overwriting", but I think "overriding" is the better word).

Chapter 5: Enterprise Content Management: This chapter starts with image management and permissions, covers document management with WebDAV functionality and records management, and ends with content authoring. It contains lots of reference material. Not every part is important for someone just starting to learn; each part applies to different functionality and can be used as needed. A great reference chapter for working with the built-in content management portlets.

Chapter 6: DDL and WCM: DDL here is not SQL's data definition language; it refers to web content management's dynamic data lists and dynamic data mapping. I was a little confused when I started, but the first paragraph cleared this up. The chapter is dedicated to the WCM functionality of Liferay and also covers localization, templates, and the types of files used. Good to refer to for more in-depth knowledge of Liferay's web content functionality.

Chapter 7: Collaborative and Social API: This chapter covers the blog, message board, asset management, and chat portlets from Liferay. It contains ready-to-use information for working with that functionality directly, plus framework information for backend modification if required. Good to go through before starting to use Liferay's built-in collaboration and social media portlets.

Chapter 8: Staging, Scheduling, Publishing, and Cache Clustering: This chapter is useful after all the required portlets are developed: running in a staging environment, deployment configuration such as cache configuration, portal instance configuration, groups, layouts, staging, and publishing. The nicely arranged information in this chapter is useful after development and before publishing to the server.

Chapter 10: Indexing, Search, and Workflow: This chapter covers SEO, autocomplete, OpenSearch, and workflow. It contains topics about which I have very little knowledge :). I went through the chapter and feel I need to revisit it.

Chapter 11: Mobile Devices and Portlet Bridges: It includes information about layouts and themes, and creating portlets with many different technologies.

Final thoughts: This book contains lots of in-depth information about Liferay, which shows the authors' deep knowledge of it. I don't think even a good Liferay developer knows all of this material. At the same time, I found the book lacks some examples. I loved this book from the first read, so it is obviously a great book to have.

I am very thankful to the team at Packt Publishing for giving me the chance to review the book. I hope the review will be helpful to the author and the Packt team.

I have already provided my initial reviews. Now writing final reviews about the book.

In the book Liferay Beginner's Guide, at first look I found that the contents are well arranged: a little theory first, then an example. So if a person follows the complete book, he/she can learn Liferay easily. I appreciate the efforts of the Cignex Datamatics team. Good luck.

Good content in the book:

  • For starting with, learning, and setting up Liferay, this book is the first choice.
  • Contains enough examples and exercises ("Time for action" sections).
  • Covers all of Liferay's built-in portlets and shows how they can best be used.
  • Contains information on the initial setup and requirements.

This book is for people who do not know anything about Liferay, want to learn it, and are searching for a good starting point. Their search ends here.

If one wants to design a website based on Liferay, one can easily create it after reading and understanding the book.

The book contains lots of things that may also be available in the Liferay documentation, but I appreciate the simplicity of the book.

Wishing all the best to the book's authors and publisher.

For the last year, I have been working on Liferay projects, and I was looking for a good introductory book on Liferay. Lots of companies are moving towards Liferay, and from a developer's standpoint it is really good to learn. Why do I like Liferay? Some of the built-in tools are stable and good, and integration is possible with almost all Java libraries as well as other platforms like PHP and .NET. So as a developer you are not limited in providing integration support for your application.


Shortcomings:

  • No development information, i.e., how to develop portlets with Liferay.
  • Integration support is not covered.
  • Liferay 6.1 is on the way, and since this is a paper print there is no information about it in the book; there is no information about older releases either.
  • I would hope for an advanced version of the book.

I will add more summary and reviews of the book as I read through and finish it.

Create/Edit iso image in linux

Posted: May 21, 2009 by Narendra Shah in Linux Box

It's a rather trivial task to make changes to an already burned installation or live CD, whether to add files to the CD or edit files on it. In either case, it's impossible to loop-mount the .iso file and then save it, as the iso9660 filesystem is read-only.

So, first mount your CD or ISO image to some directory with these commands:

sudo mkdir /mnt/image
sudo mount /dev/cdrom /mnt/image

or


sudo mount /path/to/your.iso /mnt/image -o loop

then copy its contents to some directory:

mkdir /tmp/newiso
cp -r /mnt/image/. /tmp/newiso   # the trailing /. copies the contents, not the folder itself

After this you can modify any files in /tmp/newiso: add files or delete them. Once the modifications are done, create a new ISO image to be burned onto a CD (or kept somewhere for a rainy day):

cd /tmp/newiso

and

sudo mkisofs -o /tmp/new.iso -b isolinux/isolinux.bin -c isolinux/boot.cat -no-emul-boot -boot-load-size 4 -boot-info-table -J -R -V "new iso name" .

After mkisofs finishes, the new ISO file will be available in the /tmp directory.
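As a quick sanity check before burning (a sketch, assuming the image was written to /tmp/new.iso as above), you can confirm the file carries the ISO 9660 signature "CD001":

```shell
# The primary volume descriptor of an ISO 9660 image starts at sector 16
# (byte offset 32768); bytes 32769-32773 of a valid image spell "CD001".
ISO=/tmp/new.iso
if [ -f "$ISO" ]; then
  dd if="$ISO" bs=1 skip=32769 count=5 2>/dev/null
  echo
else
  echo "no image found at $ISO"
fi
```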

About OS install on USB pen Drive

Posted: March 23, 2009 by Narendra Shah in Linux Box, Reserch And Development

After a long time, I am back, with operating systems on a USB pen drive. I am not forcing you to install Linux on a pen drive, but sometimes it is very helpful to have an OS handy: just put your USB drive into any machine and your preferred OS is ready to use.

Today let me introduce a piece of software that works on both Linux and Windows to install an ISO onto your pen drive. It is a really nice, very user-friendly tool; I had been looking for such a tool for a long time. One more thing: it is open source.

http://unetbootin.sourceforge.net/

UNetbootin allows you to create bootable Live USB drives for a variety of Linux distributions from Windows or Linux, without requiring you to burn a CD. You can either let it download one of the many distributions supported out-of-the-box for you, or supply your own Linux .iso file if you’ve already downloaded one or your preferred distribution isn’t on the list.

And one more thing: I am currently studying a Debian-based OS named BackTrack 4, which includes tools for hacking. If you are interested, get it from the net, run it from USB as well, enjoy hacking (not cracking), and get to know your network activity.

–Narendra Shah.

This time I am late putting up a blog post. Today I came with the idea of installing Linux on a pen drive.

Now the question is: why install Linux on a USB pen drive?

I have seen many people who don't want to experiment with a new OS on their laptop or computer hard disk, generally because they are afraid of hard disk failure. That is one reason Linux is not used.

Some users think it is difficult to install Linux on a hard disk (multi-boot and removal problems, and more), so they don't play with it. Frankly speaking, I am with them.

Nowadays USB drives cost very little: approximately 2 GB @ Rs 250 / $4 and 8 GB @ Rs 750 / $6 with a good brand name. It is better to use one of them than your own hard disk.

Now let me come to the point. I found a very easy way to install Fedora Linux on USB drives. Not much needs to be said about Fedora: it is a very nice OS used by many people, and I like it too.

To install, you just need to download liveusb-creator, which is available for Windows (8.9 MB) and Linux (183 KB).

https://fedorahosted.org/liveusb-creator/wiki

Get it from the site and follow the steps given there, and your Fedora Linux is ready. You can use it whenever you like on your laptop or computer, without any problem.

Come out of that costly Windows and have a look at what Linux is like, and then decide which one is better.

I hope you will try it now. 

Linux Performance

Posted: November 14, 2008 by Narendra Shah in Linux Box

1. How to make things run faster?

This actually is not a problem. Most people who are new to Linux are usually surprised by the speed it works at. They love finding the machine responding faster and the decrease in system boot-up and application launch times. But for enthusiasts it is never enough; they always want a bit more. They push the machine to its extremes, and that's what they rejoice in.

From another perspective, we may need our computer to do a particular job faster than other jobs. For example, one may want the CD burning program to be allocated more resources so that the CD is not corrupted, or one may like databases to respond quicker.

For this to be accomplished, you need to run the program with a higher priority. In the Linux world, the term 'niceness' corresponds to priority. The second thing to remember is that a program with LESS niceness has higher priority than others. The easy way to remember this is that a person who is less nice to others will put HIS work first before allowing his neighbors to use the resources. So a less nice process will claim more resources for itself than for the other processes.

The niceness of a process in Linux will typically range from -20 (greatest priority) to +19 (least priority). So if you need to run a process with more priority, run it with less niceness. The easiest way to do this is as follows :

# nice --adjustment=<niceness> COMMAND

This will run COMMAND with the specified niceness; the nice command sets the priority of a process in Linux. Note that a negative niceness (a higher priority) can normally be set only by root. However, running the command as root would launch the process as root, which is not desirable. To let all users raise priority with nice, one option (with obvious security implications) is to set the setuid bit on it as root:

# chmod +s /usr/bin/nice

Now, an example of the nice command can be as follows:

# nice --adjustment=-15 vmware

This runs VMware with a much higher priority (-15) than normal (0).

NOTE: Remember that you should not run any process with a niceness lower than -15 (without knowing what it actually does), as this may make other processes really slow, depending on what the process does.

Let me discuss what benefits I get by using the above example:

I run my Windows VM with only one processor allocated to it, which is fairly OK in most cases where heavy multitasking is not required. But when I am multitasking in the virtual machine and do not want to slow down my host system, which I use for downloading, copying files, and listening to music at the same time, I am at a loss. In this case, I run VMware with a higher priority (a lower niceness) so that all the power of the ONE processor goes to the Windows in the virtual environment, making it run faster (however, multiple virtual machines at one time will take more power from both processors), while my real machine works at a fair speed as well.

So if you understand what your program does, you can very well run each and every program at a favorable speed.
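Relatedly, a process that is already running can be re-prioritized with renice; here is a small sketch. Any user may raise the niceness of their own processes, but only root may lower it:

```shell
# Start a throwaway background job, then lower its priority.
sleep 30 &
PID=$!

# Raise its niceness to 10 (lower priority); allowed without root.
renice -n 10 -p "$PID" >/dev/null

# Read back the niceness column for that PID.
NI=$(ps -o ni= -p "$PID" | tr -d ' ')
echo "process $PID now has niceness $NI"

kill "$PID"
```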

NOTE: Do not be under the illusion that launching all programs with a higher priority will make them all run much faster. The niceness of a process is RELATIVE TO OTHER PROCESSES, so only the few programs that need to run faster should be launched with a lower niceness, not all of them.

I will add more performance hints here as time goes by.