Friday, October 7, 2011

A Fallen Legend

Steve Jobs passed away in October 2011, and Apple is mourning the loss. Steve lost his battle with cancer. Although we had our differences of opinion about what is good for consumers, we also shared many of the same viewpoints. Steve had an uncanny sense of what he saw as the future of computing. He strove for perfection and excellence in all he did.

The technology revolution would be totally different if he hadn't been a part of it. He pioneered the revolution. He saw a way to market his ideas, and was very good at convincing people that his ideas were not crazy. He will be missed. What is the future of Apple now? Only time will tell.

One of the True Great Idea Men of the 20th Century

Monday, October 3, 2011

Linux New Beginnings

Linux can bring life to old hardware, depending on what distribution you use. This is both good and bad. It is good because it means that if you have an old system, you do not have to spend money on it to keep it fully patched and updated. It will still provide an acceptable user experience, and your PC will run as well as, if not better than, it did on the previous operating system.

The con is that you shouldn't expect an old system to handle graphics-intensive work. Don't try to run the latest version of Ubuntu or Fedora on a 10-year-old machine.

But old systems are good for the grandparents, or for people who just want to surf the internet or do word processing. That is a good way to recycle that old PC.

Saturday, June 4, 2011

wget Usage: Super, Elite, Ninja's Guide

Well, if you made it this far with the posts, I commend you. Well done. You have come a long way. Now is where the real fun starts. :)

A simple way to keep a mirror of a web site is to use the --mirror (-m) option, like this:

wget -m http://example.com/

You can also get a log of this activity, and of what happens, by adding -o weblog to the command line, which is useful.
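Putting those two flags together, a minimal mirroring command might look like this (example.com is just a placeholder here, substitute the site you actually want to mirror):

```shell
# Mirror a site and write a log of the activity to a file named "weblog".
# --mirror (-m) turns on recursion, infinite depth, and timestamping.
# -o weblog sends wget's messages to the file "weblog" instead of the screen.
wget -m -o weblog http://example.com/
```

The mirrored files end up in a directory named after the site (here, example.com/), and you can read weblog afterward to see exactly what was fetched.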

There are so many options for wget. If you have a problem, or would like to see more of what wget can do, view the well-documented and easy-to-read help with wget --help.

Very nice documentation. Good luck, and see you next time.

The Linux Redneck.


Monday, May 30, 2011

wget Usage: The Advanced Guide

Well, I'm back as promised. This time we are going to really geek it out with some advanced features of wget.

You are able to create an exact copy, or clone, of a website with wget by using the -r and -l arguments. For example:

wget -r -l1 http://example.com/

This command will create an exact mirror of the site you specify, including the directory structure and everything, and it will save the information in a directory named after the site. The -r flag means do it recursively. The -l flag tells wget how far you want it to dig into the site. If you don't specify a level, wget defaults to digging five levels deep, which may take a long time depending on how complex the site is.

I mentioned previously that wget is useful for getting all the pictures or video files from a site. Well, here is how to do it. It is a bit complicated, but once you do it once or twice you will get it down.

wget -r -l 2 -np -A .gif http://example.com/

The -np flag means no parent, which means references to the parent directory are ignored; wget will never climb above the directory you started in. -A .gif means to accept only files with the .gif extension.
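Here is the picture-grabbing command spelled out with comments (example.com and the /pictures/ path are hypothetical, point it at the real site you care about):

```shell
# Recursively fetch only .gif files, two levels deep, without
# climbing above the starting directory.
#   -r        recurse into links
#   -l 2      limit recursion to two levels
#   -np       no parent: never ascend to the parent directory
#   -A .gif   accept only files ending in .gif
wget -r -l 2 -np -A .gif http://example.com/pictures/
```

Swap .gif for another extension to grab other kinds of files; -A also takes a comma-separated list, such as -A .jpg,.png.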

Suppose that you were in the middle of downloading some files and, whoops, the power goes out and your computer is dead. What are you to do? YOU START OVER FROM FILE 1 OF 28,344,734 FILES. WRONG!!! wget has a solution, and this is why Linux is so great. It is prepared for disasters, as long as you know the key:

wget -nc -r http://example.com/

-nc stands for no clobber. In other words, do not download files that already exist.
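So a disaster-proof restart of a recursive job might look like this (example.com and bigfile.iso are placeholders). A related flag worth knowing is -c (--continue), which resumes a half-finished file where it left off instead of starting it over:

```shell
# Restart an interrupted recursive download.
#   -nc  no clobber: skip files that were already downloaded completely
#   -r   recurse, same as before
wget -nc -r http://example.com/

# For a single large file that was cut off partway through,
# -c resumes from where the download stopped.
wget -c http://example.com/bigfile.iso
```

Note that -nc and -c do different jobs: -nc skips whole files you already have, while -c finishes a partial one, so don't combine them in the same command.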

That is it for this post. Check back. I may post a super Ninja posting on wget for those brave souls that can't get enough wget goodness.

See you next time,

The Linux Redneck

Tuesday, May 24, 2011

wget Usage: The Beginner's Guide

wget is a pretty nifty little program. wget stands for web get, and it does what the name says: it will download an entire website if given the appropriate commands and arguments. This is what this post and future posts will teach you about. Pretty soon you will be wgetting all sorts of things. Like my papa used to say, you have to learn to crawl before you can spit. (I don't know what he meant by that, but it is what he said, and it sounds like good advice.) So, on with the basics:

wget Basics
Say you wanted to get the home page of a website. You would use this syntax:

wget http://example.com/

What you would get from that command is the index.html file. It would be saved into whatever directory you ran the wget command from, usually your home directory in Linux.

Sometimes you will run into an error when wget times out. The default is 20 tries per file, and for a big file such as a picture or video, that may not be enough to get the whole thing. So why not change the default? You can do that with the -t option:

wget -t 34 http://example.com/

This downloads the file, but since I know that it is a large file, I want to give wget more chances to get it, so I extended the number of tries to 34 attempts.

Say you want to document the output of what you did instead of just having it appear on the screen. You can save the output into a file for review or documentation later. Just add the -o option, like this:

wget -o log http://example.com/

That is about it for the basics of wget. Stay tuned for some more hair-tingling wget commands coming soon.

The Linux Redneck, signing out. :)

Wednesday, May 18, 2011

Everyday Linux

There is a new podcast and information site on the block. It is from the guys who bring you "The Tightwad Tech": Everyday Linux. It is aimed at absolute beginners. The content is interesting, and I have learned a nugget here and there myself.

The hosts work for a small school district in Texas. They are not interested in teaching you how to use Linux from an admin perspective; instead, they want ordinary, nontechnical people to use Linux, and to spread the adoption of Linux. Which I think is great!!!

I can't wait until the next show. I hope they stay motivated to do the show and last until 10 episodes. (I meant 100000000000.) Good luck, guys.

This has got me thinking. I will compile a list of the Linux and technology podcasts that I listen to or watch. I listen to a lot.


The Linux Redneck

Monday, May 9, 2011

Ubuntu 11.04: Natty Narwhal Released: Not That Bad

I have been using Ubuntu 11.04 since April 28, 2011, and I have come to the conclusion that it is not as bad as some people have said. I don't know if it is because I like change and can adapt well under pressure, or what. Isn't that what we got into the high-tech field for, and especially why we started to use Linux? We want and need a change from the ordinary.

Ubuntu 11.04 is just a different way of looking at things: a new interface and a new way of getting the job done. It may be that I had already gotten used to the interface because of the 10.10 Netbook Remix. That is when I started to really test it, and I fell in love with it.

I have not had any problems either installing it or upgrading, which I cannot say of previous versions of Ubuntu.

The way you open programs now is by searching for them. Chances are you will find either the program already installed, or a program that accomplishes the task which you can install right from the search dialog box. It will open the Software Center and give you the option to install it.

Search features are nothing new in operating systems lately, but even so, I believe the ability to install the program, or to be given a list of choices to install, is new. Not even Windows 7 has that feature. But then, Windows doesn't have a software superstore either. :)

I believe this feature is darn cool, I don't care who you are.

That is one of the small features that makes Natty my distro of choice. I enjoy change. I hope Ubuntu will continue innovating, inventing, and improving Ubuntu.

Thanks for your time,

Lance Howell

The Linux Redneck