Friday, November 28, 2014

Exploring Datomic by Øredev Conference

Datomic is a new database with an intriguing distributed architecture. It separates reads, writes, and storage, allowing them to scale independently. Queries run inside your application code using a Datalog-based language. Spreading queries across processes isolates them from one another, enabling real-time data analysis without copying to a separate store, opening full query functionality to clients of your system, and more. This talk explores Datomic's architecture and some of its implications, focused entirely on technical details.
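
For a feel of what an in-process query looks like, here is a minimal sketch using Datomic's Java peer API against a throwaway in-memory database; the datomic:mem URI avoids the need for a storage service, and the query simply lists the :db/ident attributes that every Datomic database carries:

import datomic.Connection;
import datomic.Peer;

import java.util.Collection;
import java.util.List;

public class DatomicQueryExample {
    public static void main(String[] args) {
        // In-memory database: no transactor or storage service required.
        String uri = "datomic:mem://example";
        Peer.createDatabase(uri);
        Connection conn = Peer.connect(uri);

        // The Datalog query executes inside this process, against an
        // immutable database value obtained from the connection.
        Collection<List<Object>> results = Peer.q(
            "[:find ?ident :where [?e :db/ident ?ident]]",
            conn.db());
        System.out.println(results.size() + " idents in the base schema");
    }
}

Because the query engine runs in the peer, adding more of these processes scales reads without touching the transactor.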

Monday, August 11, 2014

Running Speedment ACE on Amazon EC2


Speedment ACE is a Graph Database Converter and a powerful software development tool. A Graph Database is a NoSQL database that performs best when the relationships between the nodes of your data are the most important (and most frequently accessed) part of that data.



Speedment ACE builds a Graph Data Grid (GDG) automatically, either from existing tables or by using the ACE front end to generate those tables. A great way to check out Speedment ACE is to install it on Amazon Web Services. Here's how to get going with Speedment in 10 easy steps.

1) Create an AWS MySQL instance.

Note the database endpoint, name, username, and password, then connect from a MySQL client.

2)  From the Speedment Programmer's Guide

Create a test schema and sample users table:

CREATE SCHEMA speedment_test; 
USE speedment_test; 
CREATE TABLE `speedment_test`.`user` ( 
 `id` INTEGER UNSIGNED NOT NULL AUTO_INCREMENT,
 `name` VARCHAR(45) NOT NULL,
 `surname` VARCHAR(45) NOT NULL,
 `email` VARCHAR(45) NOT NULL,
 `password` VARCHAR(45) NOT NULL,
 PRIMARY KEY (`id`), UNIQUE INDEX `Index_email`(`email`),
 INDEX `Index_name`(`name`)
) ENGINE = InnoDB;
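
Once the schema is in place, a quick way to verify connectivity from code is a small JDBC program; a minimal sketch, where the endpoint, username, and password are placeholders for the values you noted in step 1 (requires the MySQL Connector/J driver on the classpath):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class SpeedmentTestCheck {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your RDS endpoint and credentials from step 1.
        String url = "jdbc:mysql://your-endpoint.rds.amazonaws.com:3306/speedment_test";
        try (Connection conn = DriverManager.getConnection(url, "username", "password")) {
            // Insert one sample row into the user table created above.
            PreparedStatement insert = conn.prepareStatement(
                "INSERT INTO user (name, surname, email, password) VALUES (?, ?, ?, ?)");
            insert.setString(1, "Ada");
            insert.setString(2, "Lovelace");
            insert.setString(3, "ada@example.com");
            insert.setString(4, "secret");
            insert.executeUpdate();

            // Read it back to confirm the round trip.
            try (ResultSet rs = conn.prepareStatement(
                    "SELECT COUNT(*) FROM user").executeQuery()) {
                rs.next();
                System.out.println("user rows: " + rs.getLong(1));
            }
        }
    }
}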

3) Create an AWS Windows Instance

Be sure to associate the instance with a key pair so you can connect using a Windows RDP client. Then, in the AWS Console, launch your new instance and generate a password using your private key. Also make note of the IP address of your new instance.

4) Connect to your new Windows Instance using an RDP client.

If you have Google Chrome, you can use the 2X Client for RDP/Remote Desktop.

5) Internet Explorer on your new EC2 Windows instance will ask

your permission to add every URL you visit. You may want to install another browser to avoid this.

6) Download and install the JDK 1.7.

After installing, set the JAVA_HOME environment variable to the JDK installation directory (the folder whose bin subdirectory contains the 'java' executable).

7) Download and unzip the Speedment ACE front end.

Run it from the ace.bat file located in the bin directory.



8) After registering, create a new Project.


9) Right-click on the new project and select “Add DBMS”.  

Add the info from the DB instance we created in step 1.

10) Starting on page 28 of the Speedment Programmer's Guide,

you can now explore the capabilities of the Speedment ACE front end, including generating code for rapid application development and building the Graph Data Grid (GDG), the core of Speedment's optimization capabilities.







Saturday, August 9, 2014

0802 - Intro to Graph Databases by Neo Technology

Join this webinar for a high-level introduction to graph databases. This webinar demonstrates how graph databases fit within the NOSQL space and where they are most appropriately used. In this session you will learn: an overview of NOSQL, why graphs matter, an overview of Neo4j, and use cases for graph databases.
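
To make the "relationships first" idea concrete, here is a minimal sketch using Neo4j's embedded Java API from the 2.x era (the store path is illustrative); the relationship itself is stored and traversed as a first-class part of the data:

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.RelationshipType;
import org.neo4j.graphdb.Transaction;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;

public class GraphIntro {
    // Relationship types are plain enums in the embedded API.
    private enum RelTypes implements RelationshipType { KNOWS }

    public static void main(String[] args) {
        GraphDatabaseService db = new GraphDatabaseFactory()
            .newEmbeddedDatabase("/tmp/neo4j-intro"); // illustrative path
        try (Transaction tx = db.beginTx()) {
            Node alice = db.createNode();
            alice.setProperty("name", "Alice");
            Node bob = db.createNode();
            bob.setProperty("name", "Bob");
            // The edge is a first-class citizen of the data model.
            alice.createRelationshipTo(bob, RelTypes.KNOWS);
            tx.success();
        }
        db.shutdown();
    }
}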

Wednesday, August 6, 2014

Using the Chromebook for Software Development


Recently I compared two nice Chromebooks: the Acer C720P and the HP Chromebook 11. Both laptops sell for around $200. I ended up buying the Acer because it has a touchscreen and a Haswell processor, which gives it long battery life. The HP 11 would have been a great choice too, because it is so lightweight at 2.3 pounds and charges with a standard USB charger, which would have lightened my travel bag even further.


Since then I've been using the Chromebook as my main software development machine. There are two reasons this works.

1)  Cloud Applications


The best software now runs in the cloud.




Cloud-based Integrated Development Environments (IDEs) are the present and future of software development. You can even share configurations across multiple users. I work on some legacy projects where up to 80% of developers' time is spent configuring things like environment variables on the desktop. With a cloud IDE, repeating this nightmare for every new desktop is a thing of the past.

Fantastic utilities like Pixlr for quick graphic design, Google Drive for collaborative documentation, Trello for task management, and Bitbucket for source control make cloud developers instantly productive and happier.

2) Legacy Support


Not every software project is on the cutting edge. When I need a specific desktop environment such as Windows, I rely on virtual machines to provide that environment instantly. Amazon Web Services is the pioneer in making endless resources available to software developers. Their free usage tier is a must for any engineer. VMware, VirtualBox, and Windows Azure have also begun to provide virtual machines (VMs) in the cloud.

A hacker's paradise




When it's time to play behind the scenes of the operating system, Linux is a frequent choice. Thanks to Crouton, you can now have an Ubuntu instance accessible from the Chromebook shell. I already have legacy Java projects running on my Linux instance; I work there when I don't want to use cloud resources. Linux is also great for running desktop software that can't be accessed from the cloud, such as Kerbal Space Program, where I get to spend time above the clouds.


Tuesday, June 3, 2014

Andy Hunt Video Talk Pragmatic Thinking by Ismael Marin

Video talk at the Universidad Iberoamericana Leon in Mexico at CESLG 08, in which Andy talks about his career and his new book, Pragmatic Thinking and Learning. He also gives tips on becoming a better programmer. http://ift.tt/1mOObX5

Thursday, May 22, 2014

Dave Thomas - RubyConf AU 2013 Closing Keynote by Ruby Australia

Dave Thomas needs no introduction to Ruby programmers. As co-author of "Programming Ruby: The Pragmatic Programmers' Guide" - fondly known as the "Pickaxe" - Dave was instrumental in spreading Ruby beyond its birthplace in Japan.

Monday, May 19, 2014

What is Fog Computing?


Are you in the mood for a new technology buzzword? Cisco's marketing department is rolling with the term 'Fog Computing'.
The Fog is an extension of the Cloud: in Fog Computing, data is stored closer to the devices we use to access the internet. The benefit is a reduction in bandwidth and latency.
Cisco routers already live in this grey area between our devices and the big data providers. Cisco is adding Linux operating systems to some of its routers and will manage frequently accessed data with a distributed network of these routers at the edge of the cloud.
by David Fletcher

Naturally, several definitions of Fog Computing mention another buzzword, the Internet of Things. But the concept of caching data nearby isn't new. My paper Smart Technology for Big Data talks about how Facebook caches images close to users based on how likely the images are to be accessed. Today's Wall Street Journal article, Forget 'the Cloud'; 'the Fog' Is Tech's Future, reluctantly embraces the new term.
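
The mechanics behind "keep hot data near the user" are easy to sketch at toy scale: an LRU cache keeps whatever was accessed recently and evicts the coldest entries, the same policy, in miniature, that an edge node might apply to popular images (class and method names here are illustrative, built only on the JDK):

import java.util.LinkedHashMap;
import java.util.Map;

public class EdgeCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public EdgeCache(int capacity) {
        super(16, 0.75f, true); // access-order: recent reads stay warm
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once we exceed capacity.
        return size() > capacity;
    }

    public static void main(String[] args) {
        EdgeCache<String, String> cache = new EdgeCache<>(2);
        cache.put("img1", "bytes...");
        cache.put("img2", "bytes...");
        cache.get("img1");             // touch img1 so it stays warm
        cache.put("img3", "bytes..."); // evicts img2, the coldest entry
        System.out.println(cache.keySet()); // prints [img1, img3]
    }
}
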
Welcome to the Fog.

Monday, May 5, 2014

An IT Department at your fingertips: Amazon's EC2

Amazon Web Services is an essential tool for a versatile software consultant. It gives us a vast array of possible configurations. This comes in handy when working out new ideas. But it's also great for installing and testing legacy code.

EC2 - Virtual Servers in the Cloud




The EC2 tab lets us create virtual machines of many flavors. You can test out AWS for a year using the free usage tier.

Don't have a copy of Windows handy? Spin up an EC2 instance with Windows, then connect using remote desktop software. You can also use a Linux box any time you desire.

There's much that can be done with these instances; you can even install the X Window System on a Linux instance and connect from a Chromebook using Chrome Remote Desktop.
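
If you'd rather script the launch than click through the console, the AWS SDK for Java can do it in a few lines; a minimal sketch, where the AMI id and key pair name are placeholders and credentials come from the SDK's default provider chain:

import com.amazonaws.services.ec2.AmazonEC2Client;
import com.amazonaws.services.ec2.model.RunInstancesRequest;
import com.amazonaws.services.ec2.model.RunInstancesResult;

public class LaunchInstance {
    public static void main(String[] args) {
        // Credentials are resolved from environment variables or ~/.aws/credentials.
        AmazonEC2Client ec2 = new AmazonEC2Client();

        RunInstancesRequest request = new RunInstancesRequest()
            .withImageId("ami-xxxxxxxx")   // placeholder: pick a current AMI
            .withInstanceType("t1.micro")  // free-tier eligible
            .withMinCount(1)
            .withMaxCount(1)
            .withKeyName("my-key-pair");   // placeholder: your key pair

        RunInstancesResult result = ec2.runInstances(request);
        System.out.println("Launched: " +
            result.getReservation().getInstances().get(0).getInstanceId());
    }
}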


Tuesday, January 7, 2014

Big Data Analytics by ICGX

The data revolution has only just begun. Everyone is talking about Big Data:

Big Data grows up - Forbes
Business opportunities in Big Data - INC.
Big Data powers evolution in decision making - WSJ
How Big Data got so big - NYT
Big Data is hot? Now what? - Forbes
Businesses "freak out" over Big Data - Information Week
2012: The year of Big Data - WSJ
The age of Big Data - NYT

But it's not just hype. The world's data is doubling every 1.2 years. There are 7 billion people in the world, and 5.1 billion of them own cell phones. Each day we send over 11 billion texts, watch over 2.8 billion YouTube videos, and perform almost 5 billion Google searches. And we're not just consuming it; we're creating it. We are data agents. We generate over 2.4 quintillion bytes every day from consumer transactions, communication devices, online behavior, and streaming services. In 2012, the world's information totaled over 2 zettabytes. That's 2 trillion gigabytes. By 2020, that number will be 35 trillion. We will need 10x more servers, 50x more data management, and 75x more files to handle it all.

If you're like most companies, you aren't ready. 80% of this new data is unstructured. It is too large, too complex, and too disorganized to be analyzed by traditional tools. There are 500K computer scientists yet only 30K mathematicians; we will fall short of the talent needed to understand Big Data by at least 100K. To find the opportunities in Big Data, we need new tools and new talent to mine this information and find value. We need Big Data Analytics.

Big Data Analytics is more than technology. It's a new way of thinking. It will help companies better understand customers, find hidden opportunities, and even help our government better serve citizens and mitigate fraud. It will inspire hundreds, thousands, even millions of new startups. It will alter the landscape across virtually every industry and finally answer the questions looming over every CEO's head: "How can my business use Big Data?", "What problems can it solve?", "Who should be leading the charge: the CIO, the CMO, or a Chief Data Scientist?" In every revolution, there are opportunities that will be seized only by those armed with the right tools and the right strategy. We are at the beginning of the Big Data Revolution.

Thursday, January 2, 2014

Running OpenTSDB on Amazon EC2

Although there are cheaper alternatives for production systems, it's easy enough to get the Open Time Series Database (OpenTSDB) running on an EC2 instance of Amazon Web Services.

  1. First you'll need to run HBase on EC2
  2. Make a data directory: mkdir hbase_data
  3. vi hbase-0.94.13/conf/hbase-site.xml
  4. Using vi, update the hbase.rootdir property value to: file:///home/ec2-user/hbase-0.94.13/hbase-${user.name}/hbase
  5. sudo yum install git
  6. git clone git://github.com/OpenTSDB/opentsdb.git
  7. sudo yum install automake
  8. sudo yum install gnuplot
  9. cd opentsdb
  10. ./build.sh
  11. env COMPRESSION=NONE HBASE_HOME=path/to/hbase-0.94.X ./src/create_table.sh
  12. tsdtmp=${TMPDIR-'/tmp'}/tsd
  13. mkdir -p "$tsdtmp" 
  14. ./build/tsdb tsd --port=4242 --staticroot=build/staticroot --cachedir="$tsdtmp"
  15. In AWS, click on your EC2 instance, then click "Security Groups" at the bottom left.  Click on the default group, then click the "inbound" tab.  You can now open the ec2 port 4242. 
Browsing to your instance's IP address on port 4242 will display the web UI for your instance of OpenTSDB.
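
With the TSD running, you can push a data point over its simple line-based telnet protocol; a minimal sketch in Java, where the IP address is a placeholder and the metric is assumed to exist already (create it first with ./tsdb mkmetric sys.test.temp):

import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;

public class TsdbPut {
    public static void main(String[] args) throws Exception {
        // Placeholder: your instance's public IP; 4242 is the port opened in step 15.
        try (Socket socket = new Socket("your-ec2-ip", 4242);
             Writer out = new OutputStreamWriter(socket.getOutputStream())) {
            long now = System.currentTimeMillis() / 1000L;
            // Line format: put <metric> <unix-timestamp> <value> <tag=value>
            out.write("put sys.test.temp " + now + " 42 host=ec2-demo\n");
            out.flush();
        }
    }
}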

Thursday, December 26, 2013

Running HBase on Amazon EC2

  1. Create an Amazon Linux EC2 instance.
  2. Log into your EC2 instance using ssh.
  3. sudo yum install java-1.6.0-openjdk
  4. wget http://www.apache.org/dist/hbase/hbase-0.94.13/hbase-0.94.13.tar.gz
  5. tar xfz hbase-*
  6. vi .bashrc
  7. Add this line at the bottom of the file: export JAVA_HOME=/usr/java/default
  8. sudo vi /etc/hosts
  9. Comment out the localhost line: #127.0.0.1   localhost localhost.localdomain
  10. cd hbase-*
  11. Start HBase: ./bin/start-hbase.sh
  12. Check the log files: cat logs/hbase-*
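
Once HBase is up, you can exercise it from the Java client API that ships with 0.94; a minimal smoke test, run with the HBase and Hadoop jars on the classpath, that creates a table, writes one cell, and reads it back:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseSmokeTest {
    public static void main(String[] args) throws Exception {
        // Picks up hbase-site.xml from the classpath.
        Configuration conf = HBaseConfiguration.create();

        // Create a 'test' table with a single column family 'cf'.
        HBaseAdmin admin = new HBaseAdmin(conf);
        HTableDescriptor desc = new HTableDescriptor("test");
        desc.addFamily(new HColumnDescriptor("cf"));
        admin.createTable(desc);

        // Write one cell and read it back.
        HTable table = new HTable(conf, "test");
        Put put = new Put(Bytes.toBytes("row1"));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("greeting"), Bytes.toBytes("hello"));
        table.put(put);

        Result result = table.get(new Get(Bytes.toBytes("row1")));
        System.out.println(Bytes.toString(
            result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("greeting"))));

        table.close();
        admin.close();
    }
}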