
The Anatomy of an IP Address: Exploring Its Role and Importance


Have you ever stopped to think about the distinct digital identity you carry with you when you browse the internet? Known as your IP address, this identifier acts as your digital passport throughout the vast internet.
It's more than simply a series of digits; it's the key that opens the virtual door to digital communication, enabling smooth conversation between devices across the globe.
Consider your IP address the unsung hero of the internet, a pillar supporting the complex web of online connections. It's essential to our everyday digital activities, underpinning everything from network traffic management to safe surfing.
Understanding the subtleties of your IP address is worthwhile, regardless of your level of interest in technology.

The Basics of IP Addresses
Understanding IP addresses begins with recognizing their fundamental role in the digital world. An IP (Internet Protocol) address is akin to your home’s street address, but in the vast network neighborhood of the internet.
It’s a unique set of numbers assigned to each device that connects to the internet, ensuring that data reaches its correct destination.
Defining IP Addresses
An IP address is a numerical label, like 192.168.1.1 or 3ffe:1900:4545:3:200:f8ff:fe21:67cf, assigned to every device connected to a network. It operates as a locator and an identifier, guiding the flow of internet traffic.

IPv4 vs. IPv6

There are two versions of IP addresses. IPv4, the older format, uses a 32-bit address scheme, limiting it to around 4 billion unique addresses. With the explosive growth of internet-connected devices, IPv4 addresses are running out.
Enter IPv6, which uses a 128-bit address scheme, offering a near-infinite pool of addresses. Imagine transitioning from a small town to a global metropolis in terms of address capacity.
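Curious which versions your own machine is using? On most Linux systems you can inspect both with the ip utility (a quick sketch; interface names and the addresses themselves will differ per system):

ip -4 addr show   # 32-bit IPv4 addresses, e.g. 192.168.1.10
ip -6 addr show   # 128-bit IPv6 addresses, e.g. fe80::f8ff:fe21:67cf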
Static and Dynamic IP Addresses
Most home networks use dynamic IP addresses, which change periodically. In contrast, static IPs remain constant, commonly used by businesses for reliable network accessibility.
Think of dynamic IPs as temporary rental homes that change periodically, whereas static IPs are like owning a permanent residence.
How Devices Get IP Addresses
When your device connects to the internet, it’s assigned an IP address by your Internet Service Provider (ISP) through a process called DHCP (Dynamic Host Configuration Protocol).
It’s like checking into a hotel and being assigned a room number, which becomes your identity during your stay.
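On Linux you can watch this hotel check-in happen by releasing and re-requesting a lease (a sketch; dhclient and the interface name eth0 are assumptions that vary by distro and hardware):

sudo dhclient -r eth0   # release the current lease (check out)
sudo dhclient -v eth0   # request a fresh address from the DHCP server (check in)
ip addr show dev eth0   # confirm the assigned address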
The Role of IP in Connectivity
Every time you visit a website or send an email, your device’s IP address is working diligently. It ensures that the data you request or send reaches the right destination, much like a postal service delivering mail to the correct address.
How IP Addresses Function
Having established the basics of IP addresses, let’s explore how they function in the grand scheme of internet communication. This chapter will illustrate the operational mechanics of IP addresses, enhancing your understanding of this vital cog in the digital machine.
The Communication Process
Think of the internet as a global postal system. Just as a letter needs a destination address and a return address, digital data packets sent over the internet require source and destination IP addresses.
When you send an email or access a website, your request is packaged into data packets, each stamped with your IP address (sender) and the website’s IP address (recipient).
These packets travel through various network points—routers, servers, and switches—before reaching their destination, akin to postal sorting centers.
Routing and Addressing
Each router on the internet has a specific role: to direct traffic to its intended destination efficiently.
When data packets arrive at a router, it examines the destination IP address and decides the best route to forward them, much like a traffic officer managing road intersections.
This process ensures your data navigates through the internet’s labyrinth swiftly and accurately.
IP and TCP/IP Protocol Suite
IP addresses are part of the larger TCP/IP protocol suite, which governs internet communication.
While IP addresses ensure correct data delivery, TCP (Transmission Control Protocol) assures that data packets arrive intact and in order.
Imagine IP as the postal system (routing letters) and TCP as the quality control ensuring every piece of the letter reaches its destination in the right order.
Public vs. Private IP Addresses
Devices on a home or office network have private IP addresses, visible only within the local network. Your router has a public IP address, serving as the face of your network to the wider internet.
This distinction is crucial for security and efficient network management, acting as a gatekeeper deciding what internal data reaches the external world.
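You can see both identities side by side from any machine on your network (a sketch; ifconfig.me is just one of several public echo services you could query):

hostname -I        # private address(es) on the local network, e.g. 192.168.1.7
curl ifconfig.me   # the public address your router presents to the internet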
Network Address Translation (NAT)
NAT is a method used by routers to translate private IP addresses to a public one and vice versa. This process enables multiple devices on a private network to share a single public IP address. It’s like having several internal phone extensions connected to one main phone number.
Subnetting and CIDR
Subnetting divides a network into smaller, manageable pieces, while Classless Inter-Domain Routing (CIDR) allows more flexible allocation of IP addresses. These processes optimize network performance and address allocation, akin to organizing a city into districts and neighborhoods for better management.
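The ipcalc utility makes this concrete (a sketch; the tool ships in most distribution repositories, and its exact output differs between the Debian and Red Hat variants):

ipcalc 192.168.1.0/26
# Reports the netmask (255.255.255.192), the network and broadcast addresses,
# and the usable host range: a /26 carves out 64 addresses, 62 of them usable.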
Conclusion: Embracing the Digital Signature of IP Addresses
These distinct identities do more than just enable communication; they are fundamental to a safe, well-structured, and efficient online environment.
Your IP address, a small but powerful force in the enormous digital universe, is the starting point for all of your online activities. With this knowledge in hand, you can improve your online interactions and experiences as you venture further into the digital realm.

See you in the next article!

Passhunt on Kali Linux — Exposes the Nightmare of Default Passwords


Hope you are all doing awesome. As always, today we're going to learn something from the basics of the cybersecurity world, where a seemingly harmless string of characters can become our nightmare. Yes, you guessed it: we're talking about default passwords. We'll also learn how to use a tool called Passhunt on our Kali Linux system to find these loopholes on various devices.

Imagine our own home or office. We've got all these trusty devices (routers, webcams, printers) keeping us connected to the world and running our work smoothly. But here's the catch: many of these devices come with default usernames and passwords straight out of the box.

At first glance, default passwords might seem like an easy shortcut. After all, who wants to spend time setting up a new password when we can just use the one that's already there, right? Wrong! Default passwords are like leaving the front door of our system wide open. Devices with default credentials aren't just in danger; they are the danger. Anybody can enter without knocking.

Let's have a look at how default passwords can lead to serious cyber attacks.

Network nightmares: Imagine we've got our brand new router up and running, blissfully unaware that it's still using the default password. Along comes a crafty attacker who sniffs out our router's vulnerability faster than we can type "password123". With access to our router, the attacker can wreak havoc on our entire network: slowing down our internet, eavesdropping on our data, or even launching full-blown cyber attacks. It's like handing over the keys to our system on a silver platter!

Web woes: Next up, we've got webcams. These little things might seem innocent enough, but with default passwords they're serious trouble. Imagine an attacker gaining access to our webcam without us even knowing it. They could spy on our every move, invade our privacy, or even use the footage for blackmail. It's like having an unwanted guest lurking in the shadows of our home, and nobody wants that!

Printer predicaments: Last but not least, let's talk about printers. Yes, even these humble machines aren't safe from the perils of default passwords. With access to our printer, an attacker could intercept sensitive documents, manipulate print jobs, or even pull off a good old-fashioned printer hack (yes, that's a thing!), gaining unauthorized access to documents stored in the printer's memory. It's like turning your trusty printer into an attack tool, ready to strike at a moment's notice.

So, what's the moral of the story? It's simple: always, always, always change default passwords. Whether it's our router, our webcam, our printer, or any other device under our supervision, we shouldn't take shortcuts when it comes to security. Humans are the weakest point in any security system; not stupid, just ignorant of how this stuff works. We just want the router to run perfectly for Netflix and chill.

Enough about laypeople, though. Whoever is reading this wants to think like a cybersecurity expert, and we didn't come here just for the lecture; we want to test for exactly this kind of human-error vulnerability. Passhunt is a Python script that helps us do that. Originally created by Viral Maniar, it can search through 523 vendors and their 2084 default passwords; it works like a searchable database.
Passhunt had some issues on newer Python and Kali Linux versions, so we updated the script and removed some unneeded lines to make it run on a current Kali system.

Let's fire up our Kali Linux terminal and clone Passhunt from GitHub:

git clone https://github.com/jaykali/Passhunt

After pressing Enter, Passhunt will be downloaded onto our system. Next we use the cd command to move into the Passhunt directory:

cd Passhunt

Now we install the dependencies required to run Passhunt:

pip3 install -r requirements.txt

This may take a few seconds depending on our internet speed and system configuration. Then we run Passhunt with the following command:

python passhunt.py

Our password hunter is now up and running. The main menu offers just three options:

List supported vendors: press 1 and Enter to see the names of all supported vendors (brands).
Search default password: press 2 and Enter to search for a vendor's default credentials.
Exit: type 3 and hit Enter to quit the tool.

Suppose we've encountered a D-Link device and want to try a default-password login. We press 2, hit Enter, and when prompted for the vendor's name we type D-Link. Passhunt then lists D-Link devices and their default credentials, and we can pick our target device from the list.

That's how we can search Passhunt's default-password database on our Kali Linux system. Nowadays devices shipping with a shared default password are rarer; most devices have a unique default password printed somewhere on the packaging. But plenty of old, unpatched web services are still around, and offices, universities, and households keep using very old router models to save a few bucks. People reuse the same passwords across internet banking, social media, and shopping sites, and they don't change the default passwords on their devices. That's the really scary part.

One more thing: if our target device is a router and it has a WPS button (very modern routers usually don't have one; others may), we can also try a Pixie Dust attack to crack it.

That's it for today. We already knew default passwords are very risky, and now we've learned something new: using Passhunt on our Kali Linux system to search for default passwords and credentials.

Over 14 Million Servers May Be Vulnerable To OpenSSH’s ‘RegreSSHion’ RCE Flaw


An anonymous reader quotes a report from ZDNet, written by Steven Vaughan-Nichols: Hold onto your SSH keys, folks! A critical vulnerability has just rocked OpenSSH, Linux’s secure remote access foundation, causing seasoned sysadmins to break out in a cold sweat. Dubbed “regreSSHion” and tagged as CVE-2024-6387, this nasty bug allows unauthenticated remote code execution (RCE) on OpenSSH servers running on glibc-based Linux systems. We’re not talking about some minor privilege escalation here — this flaw hands over full root access on a silver platter. For those who’ve been around the Linux block a few times, this feels like deja vu. The vulnerability is a regression of CVE-2006-5051, a bug patched back in 2006. This old foe somehow snuck back into the code in October 2020 with OpenSSH 8.5p1. Thankfully, the Qualys Threat Research Unit uncovered this digital skeleton in OpenSSH’s closet. Unfortunately, this vulnerability affects the default configuration and doesn’t need any user interaction to exploit. In other words, it’s a vulnerability that keeps security professionals up at night.
It's hard to overstate the potential impact of this flaw. OpenSSH is the de facto standard for secure remote access and file transfer in Unix-like systems, including Linux and macOS. It's the Swiss Army knife of secure communication for sysadmins and developers worldwide. The good news is that not all Linux distributions have the vulnerable code. OpenSSH versions earlier than 4.4p1 are vulnerable to this signal handler race condition unless they are patched for CVE-2006-5051 and CVE-2008-4109. Versions from 4.4p1 up to, but not including, 8.5p1 are not vulnerable. The bad news is that the vulnerability resurfaced in OpenSSH 8.5p1 up to, but not including, 9.8p1 due to the accidental removal of a critical component. Qualys has found over 14 million potentially vulnerable OpenSSH server instances on the internet. The company believes that approximately 700,000 of these external internet-facing instances are definitely vulnerable. A patch, OpenSSH 9.8/9.8p1, is now available. Many, but not all, Linux distributions have made it available. If you can get it, install it as soon as possible. If for whatever reason you're not able to install a patch, Vaughan-Nichols recommends you set LoginGraceTime to 0 in the sshd configuration file and use network-based controls to restrict SSH access, while also configuring firewalls and monitoring tools to detect and block exploit attempts.
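For admins wondering where they stand, here is a minimal sketch of the version check and the stopgap mitigation described above (the config path assumes a standard glibc-based Linux install; note that LoginGraceTime 0 trades the RCE risk for exposure to connection-exhaustion denial of service):

ssh -V                          # vulnerable: pre-4.4p1 unpatched, and 8.5p1 up to (not including) 9.8p1
sudoedit /etc/ssh/sshd_config   # add or change the line: LoginGraceTime 0
sudo systemctl restart sshd     # apply the new configuration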

The Impact of Pre-Order Website Templates on Customer Anticipation


In e-commerce, businesses constantly look for inventive approaches to improve the customer experience and drive engagement. One strategy that has gained notable traction is the adoption of pre-order website templates.

This article delves into the intricacies of pre-order website templates, unravels their significance, describes their benefits, and illustrates their effectiveness with examples of successful implementations. By the end, you will understand how to create pre-order website templates with customer anticipation in mind, and how these templates contribute to a more compelling and engaging online shopping experience.

Understanding the Role of Pre-Order Website Templates

A pre-order website template plays an important role in creating effective web pages for products or services that have not yet been released. It allows companies to collect interest and orders from potential customers in advance, creating a stream of pre-orders even before the products launch. Such a template can anchor the marketing strategy for a new product or service.

Benefits of Using Pre-Order Website Templates

1. Generate early interest and anticipation
Pre-order templates allow businesses to generate early interest and build anticipation for upcoming products or services. By showcasing teasers, sneak peeks, and exclusive information on the pre-order page, you can entice customers to purchase the upcoming release.

2. Market research and demand forecasting
Pre-orders serve as a valuable source of market research, providing insights into demand for a product before it hits the market. Analyzing pre-order numbers helps businesses forecast demand, adjust production levels accordingly, and plan marketing strategies more effectively.

3. Customer engagement and loyalty
Offering exclusive pre-order incentives, such as discounts, limited-edition items, or early access, enhances customer engagement and loyalty. Customers feel valued when they can be among the first to experience a new product, fostering a sense of connection with the brand.

4. Efficient inventory management
Pre-orders enable businesses to gauge demand for a product before its official release, allowing for more efficient inventory management. This helps avoid overstock or stockouts, optimizes the supply chain, and reduces the risk of excess inventory.

5. Cash flow and financial planning
Collecting payments through pre-orders provides a steady cash flow even before the product is launched. This can be crucial, especially for small and medium-sized enterprises, as it aids financial planning and covers initial production costs.

6. Reduced time-to-market
By initiating pre-orders, businesses can gauge market interest early, allowing for adjustments to production schedules. This can reduce the time it takes to bring a product to market, giving the business a competitive edge.

7. Enhanced SEO and online visibility
Pre-order pages contribute to SEO efforts by creating additional content and links for the upcoming product. This can improve online visibility and attract potential customers searching for information before the release.

How to Create an Ideal Pre-Order Website Template Based on Customer Anticipation

Creating an ideal pre-order website template based on customer anticipation involves several key considerations. Here's a step-by-step guide.

1. Understand your audience
Identify your target audience and their preferences. Conduct market research to understand what features and products they anticipate.

2. Clear product presentation
Showcase upcoming products with high-quality images and detailed descriptions. Use visually appealing design elements to capture attention.

3. Create a countdown timer
Build anticipation by adding a countdown timer to the launch date. Display the timer prominently on the homepage to create a sense of urgency.

4. User-friendly pre-order process
Simplify the pre-order process with a user-friendly interface. Include a clear call-to-action button for easy pre-ordering.

5. Transparent information
Provide transparent information about the pre-order terms and conditions. Clearly state the expected delivery dates and any potential delays.

6. Customer reviews and testimonials
Include reviews or testimonials from beta testers or influencers, if available. Build trust by showcasing positive experiences with the product.

7. Responsive design
Ensure your website template is responsive across various devices. Optimize for mobile users to capture a wider audience.

8. Email marketing integration
Collect email addresses for updates and notifications. Set up an automated email campaign to inform customers about the status of their pre-order.

9. Responsive customer support
Have a dedicated customer support system to address inquiries related to pre-orders. Provide multiple contact channels for customer assistance.

Top 3 Examples of Pre-Order Website Templates' Impact on Customer Anticipation

Technology sector: iPhone launch
Apple's iPhone site features stunning visuals, detailed product information, and a user-friendly interface that seamlessly guides customers through pre-ordering. The countdown timer adds an element of excitement, encouraging users to secure their new iPhones before the official release.

Fashion industry: exclusive apparel lines
High-end fashion brand Louis Vuitton uses pre-order website templates to generate excitement for limited-edition clothing lines. The Louis Vuitton site showcases exclusive collections with enticing visuals, behind-the-scenes content, and an easy pre-order process. Fashion enthusiasts can easily navigate the site and build anticipation for upcoming releases.

Entertainment: concert tickets
Ticketmaster, a leading ticketing platform, uses dynamic pre-order pages to build anticipation for upcoming concerts. Its platform integrates interactive elements like virtual seat previews and artist insights to create a compelling pre-order experience. With an easy-to-use interface and personalized notifications, Ticketmaster maximizes ticket pre-sales and increases overall excitement for live events.

Conclusion
Pre-order website templates create a compelling online shopping experience and build customer anticipation. Businesses that embrace them with a customer-centric mindset can reap the benefits above, foster a stronger connection with their audience, and gain a competitive edge in the ever-evolving e-commerce landscape.

Nitrux 3.5.1 Available for Install » Linux Magazine


Nitrux is one of the more popular immutable, systemd-free Linux distributions, and the developers have announced that the latest release is available for installation.

This latest iteration ships with a Liquorix Linux kernel (version 6.9.7-1), which is designed for low-latency computing in audio/visual production, as well as to reduce timing deviations when playing games.

Nitrux 3.5.1 also ships with the latest NVIDIA 555 graphics driver, which includes explicit GPU sync for Wayland. You'll also find the Mesa 24.1 graphics stack included.

NetworkManager received some attention as well. With this release, dhcpcd is used for DHCP addresses and Dnsmasq serves as the DNS resolver (instead of the internal DNS client). NetworkManager also handles all network interfaces by default.

The Calamares installer received several configuration updates, such as preventing invalid characters in hostnames, a TypeError fix in the Users QML module, and a reduction of the time characters remain visible in the password field while typing a password.

Other changes focus on the sysctl configuration, the initramfs, ISO size reduction, and desktop configuration, plus the addition of udev rules for brightnessctl, VR devices, YubiKeys, and several game controllers, as well as the addition of the lm-sensors hardware health monitoring system.

You can read more about the latest release in the official announcement and download an ISO for installation from the main Nitrux page. Current Nitrux users can follow the upgrade steps supplied by the developers.


scrcpy 2.0 Adds Audio Forwarding, H265 And AV1 Codec Support


scrcpy, a tool to display and control Android devices from the desktop, has been updated to version 2.0, receiving a major new feature: audio forwarding. But that's not all; this release also adds H265 and AV1 video codec support, along with other changes.

scrcpy is a free and open source application that can display, record, and control Android devices connected via USB or wirelessly, from a macOS, Windows, or Linux desktop. The application focuses on performance and quality, offering 30~60 FPS at a resolution of 1920×1080 or more, with low latency.

Audio forwarding, one of the most requested features, has landed in scrcpy with the 2.0 release. It is enabled by default for devices running Android 11 and higher. On Android 10 and older, audio cannot be captured, so the feature is disabled.

It's important to note that on Android 12 and newer, audio forwarding works without any tweaks. On Android 11, though, you'll need to make sure the device screen is unlocked when starting scrcpy.

Now, when recording your Android 11 or newer device from the desktop using scrcpy, audio is also recorded. If you don't want to forward audio, disable it by passing the --no-audio command line flag.

There are also various command line flags in this release to modify how audio forwarding works. By default, the buffer size is set to 50 ms, but you can change this using --audio-buffer. It's also possible to change the codec and bitrate (defaulting to Opus at 128 Kbps) using --audio-codec and --audio-bit-rate, with the possible codecs being opus, aac, and raw (uncompressed PCM 16-bit LE). List the audio encoders available for your device using --list-encoders, then specify the encoder to use with --audio-encoder.

Besides audio forwarding, other changes in scrcpy 2.0:

Add H265 and AV1 video codec support
Add --list-displays and --list-encoders
Fix clicks on Chrome when --forward-all-clicks is enabled
Retry on spurious encoder error
Make --turn-screen-off work on all displays
Restore resizing workaround for Windows
Upgrade platform-tools to 34.0.1 (adb) in Windows releases
Upgrade FFmpeg to 6.0 in Windows releases (and use a minimal build)
Upgrade SDL to 2.26.4 in Windows releases

You might also like: Helper GUI For scrcpy, The Android Desktop Display And Remote Control Tool

Download / install scrcpy

scrcpy works on Linux, Microsoft Windows, and macOS. The application's GitHub releases page has binaries for Microsoft Windows only. On Linux, you can install it from your distribution's repositories or a third-party repository, if available, or build it from source. For example, the tool is available in the Arch Linux and Debian (and Ubuntu, Linux Mint, etc.) official repositories; however, on Linux distributions like Ubuntu, it won't be updated to the latest version until a new Ubuntu release is out.

You may want to check out my older (but updated today) article on installing scrcpy on Fedora or Debian, Ubuntu, Linux Mint, Pop!_OS, etc. The article includes instructions for building scrcpy from source (and a Copr repository for Fedora).
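As a quick usage sketch (assuming scrcpy 2.0 is installed and adb already sees your device; encoder names vary per device, so list them before picking one):

scrcpy                    # mirror the device; audio forwarding is on by default on Android 11+
scrcpy --no-audio         # mirror without audio
scrcpy --list-encoders    # show the audio/video encoders your device offers
scrcpy --audio-codec=aac --audio-bit-rate=64K   # switch codec and bitrate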

Kali Installation : Dual Boot VS Live Boot VS Virtual Machine


If you are yet to have a Kali instance running on your machine, then you have quite a dilemma ahead of you. There are three ways to go about running Kali, each with its own advantages and disadvantages. In this article, I'll tell you what exactly the terms dual boot, live boot, and virtual machine installation mean, how easy or difficult each is to perform, and what the advantages and disadvantages of each are. At the end, I'll tell you how to find guides for doing all of these.

PS: This guide (and the blog) is focused on Kali, but everything in this post applies to Linux in general. Certain parts are related to hacking, but you can take networking lessons from them regardless, even if you aren't interested in hacking per se.
Dual Boot

Most of you would be running a single operating system on your system right now. However, that doesn’t have to be the case. We can partition our hard disk, and install multiple operating systems alongside each other. 

Think of how you have multiple partitions in your Windows (C, D, E, F drives). All your Windows system files would usually be in C (local disk). What if you let go of drive F (copying its content to C, D, and E first) and decide to install Kali's system files on it? (You can install Kali's system files on your computer using the Kali .iso file that is available for download.) Now you will have three drives in the Windows format (NTFS) and one drive in a Linux format (ext4). The C drive (NTFS) will have Windows installed, and the F drive (ext4, and its name isn't really F anymore) has Linux. But since your computer loads the system files during bootup, it needs to know whether to load files from the C drive or from the "formerly F" drive. This is handled by the bootloader.
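Here's roughly how that layout looks from a Linux live session (a sketch; device and partition names are placeholders and will differ on your machine):

lsblk -f
# NAME   FSTYPE LABEL
# sda1   ntfs   Windows     <- the C drive, Windows system files
# sda2   ntfs   Data        <- D/E-style data partitions
# sda3   ext4   kali-root   <- the "formerly F" partition holding Kali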

This was a gross oversimplification. Here's a nice article on HowToGeek that explains things in more detail.

This is where the Kali installer asks where it should install the OS. In the sample explanation, you should install it where the "F" drive of Windows is. If you instead install it over the "C" drive, you'll lose Windows and will only have Kali on your system.

Once you have installed Kali on a system which already had Windows, the bootloader (GRUB) will ask you which of them to boot from.

Live Boot (USB)

In the above example, we had Windows on our C, D, E, and F partitions. The C partition had the system files, while D, E, and F had other files. We decided to overwrite F and install Kali's system files there. When we wanted to run Windows, we booted from C, and when we wanted to run Kali, we booted from the "former F drive" (of course, we didn't have to know exactly what we were booting from; GRUB handles that for us, we just have to choose). So, can we, instead of installing Kali on our F drive, install it on an external hard disk, and then boot from that external hard disk? The answer is yes. Well, you may ask: the size of Kali's ISO is under 4 GB. What if I have a 16 GB USB flash drive? Surely the installed OS will not take more than 16 GB. Why use a hard disk; why not just install the OS on a USB flash drive?

The answer to that is yes too. You can buy 10 USB flash drives, install 10 different operating systems on them, and then plug in whichever one you want and boot from it; and if your OS supports the filesystem of your hard disks, you can use your computer's hard disks as well. You actually don't even need hard disks at all. You can run your computer from a flash drive itself.

However, remember how I said to install the OS on the USB flash drive? It turns out you don't even have to install the OS. In general, for most software there is an installer, and after the installer finishes its job, we have the software installed and can use it. Take a simple game: suppose it has a setup.exe file on the CD you bought. When you run that, you can't yet play the game; you instead need to install it on your hard disk, after which it can be played. This is true for operating systems as well. If you plug a Windows installation CD/DVD/USB into your computer, it will do what the name says: install Windows on your computer. Upon installation, you can run Windows. But with some Linux distributions, we have the ability to run the OS without installation (live boot). You can take the ISO, burn it to a DVD, and "live boot" it. It will not touch your hard disk; everything runs directly in your primary memory (RAM). Hence, the installer also acts as the installed software.

So, simply download the Kali Linux ISO, copy it to a USB drive, and you are done. Except for a little problem: USB drives are not bootable by default. So you need a little piece of software that will properly perform the copying of the ISO to the USB drive, such that it can be booted from.
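On Linux, dd is one such tool (Rufus and balenaEtcher are popular graphical alternatives on Windows). A minimal sketch; /dev/sdX is a placeholder, and pointing dd at the wrong device will destroy its contents, so verify the name with lsblk first:

sudo dd if=kali-linux.iso of=/dev/sdX bs=4M status=progress && sync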

In summary: download the ISO, use a tool to intelligently copy the ISO to a flash drive, plug in the flash drive, and boot from it. It will ask whether you want to install the OS or start running it right away (live boot). Just select the live boot option, and Kali is up and running without any installation. However, since everything happens in volatile primary memory (RAM), changes are lost. So every time you boot into the live USB, it's like running a fresh install (which can be both a good and a bad thing). With persistence mode, even this limitation is overcome, and you can have changes that persist across boots.

These are the choices offered when you boot from Kali's installer on a USB. You can run it live, run it live with persistence, or install the OS.

Virtual Machine

Suppose you only have Windows on your machine. How do you go from a powered-off system to a fully functional Windows running on your machine? Actually, a more useful question is: what do you need to go from nothing to a functional running OS? Here are a few things I can think of:

System files that run the OS (in other words, the system files basically are the OS).
A small core utility (the bootloader) which can load the system files into memory from the hard disk when the computer is otherwise in a void-like state.
Memory where the system files are loaded.
Processing power which runs the OS.
Hard disk space where you can store stuff, networking so that you can access the internet, and so on.

So, from a powerless state, in the presence of all the above, we can move to a state where we have a functional Windows instance running on our system. The question I want to ask you is, from a state where we have a functional Windows instance running on our system, can we move to a state where we have two functional OSs running on our system?

The answer should be: why not? If we have all the requirements that take us from 0 to 1, then if the same requirements are met again, we can go from 1 to 2. In other words, if we have:

System files that run the second OS
A different core utility which can load the system files into memory from the hard disk when we already have an OS running on the system (as opposed to being in a void-like situation).
Memory, separate from the already running OS's memory, where the system files of this OS are loaded.
Processing power, separately for this OS, which runs the OS.
Hard disk space, separately for this OS, where you can store stuff; networking so that you can access the internet; and so on.

The above discussion should tell you that it is indeed possible to run multiple OSs together, by somehow dividing the memory, hard disk space, processor power, etc. between them and letting each OS run on its share. Without going into too much detail, let me just tell you that using hypervisors, this has indeed been achieved, and we can now run multiple OSs inside one OS, given enough resources to sustain the needs of all the simultaneously running OSs. VMware has been a pioneer in this technology, but they only offer the limited-capability VMware Player for free, while VMware Workstation will cost you. On the other hand, VirtualBox is free and open source.

Now that you know about all the different ways to run Kali, be it alongside Windows, inside Windows (virtually), or live without installation, let me tell you about advantages and disadvantages of these methods.

Multiple operating systems can run simultaneously as virtual machines. In the picture, you can see VMware Workstation and various virtual machines running on it.

Comparison

Live Boot vs Dual Boot

Dual boot performs faster than live boot and has persistence (live boot with persistence is also available, but that is limited persistence). If you are using a live USB, you have to keep updating the ISO version on the USB frequently (download a new ISO, then write that ISO to the USB). If you have dual boot, you'll update Kali the usual way, with apt-get update, upgrade, and dist-upgrade.
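That usual way is the standard Debian package workflow (shown with apt-get, matching the commands named above):

sudo apt-get update          # refresh the package lists
sudo apt-get upgrade -y      # upgrade installed packages
sudo apt-get dist-upgrade -y # also pull in new kernels and changed dependencies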

I have put this point of comparison first because this is the only point of difference between live boot and dual boot. The two are identical in every other aspect, and from here on, I’ll use live boot to refer to both live boot and dual boot.

Hardware access

In live booting, when you are running Kali, it is the sole owner of all the resources the computer offers (except the hard disk space occupied by Windows, which is not a major concern). Not only that, it has access to the internal wireless card of your machine. We'll get a better idea of the hardware advantages by looking at what we don't get inside a virtual machine.

When Kali is running from inside a virtual machine, it doesn't have access to:

Full CPU/GPU power (because the processor must be shared between the two simultaneously running OSs). This means slower cracking; processor-intensive tasks like cracking a WPA2 4-way handshake will suffer here.
No direct access to internal hardware, only bridged access. What this means for you is that you can't access the internal wireless adapter of your laptop. So, for wireless hacking, you will need to purchase an external wireless adapter if you are working inside a VM. (Even if you are live/dual booting, you may want an external wireless card anyway, because internal wireless cards are weaker, have less driver support, and sometimes don't support packet injection, which many attacks need.)

So, for wireless hacking, Virtual Machine isn’t the best way to go.

Networking

In live booting, you are a direct part of the local network you are connected to. In virtual booting, your host computer is part of that network, and you are part of an internal network which contains only you, your host, and the other guests.

First, let me explain some technical jargon-

Internal network – When you connect to your Wi-Fi router, you, along with the other connected devices (your iPhone, Android phone, MacBook, PC, etc.), become part of a local network. The internet knows only about your router. Every communication must be sent via the router to the internet; the internet responds to the router, and the router returns the response to the appropriate system on the local network.
VMnet – The equivalent of an internal network, with the guest virtual machines and the host machine as its members.
Host machine – The machine on which VMware/VirtualBox is installed, and inside which the virtual machines run.
Guest machine – The machines running inside VirtualBox/VMware.
Internal IP – Your IP on the local network.
VMnet IP – Your IP on the virtual network (VMnet). [This is not a standard term; internal and external IP are standard terms, but I'm using this one for convenience.]
External IP – Your IP on the internet.

If any of these machines makes a request to the internet, their external IP will be the same. To check this, open your smartphone and search "What's my IP" on Google. Repeat this from all your other devices connected to the same router. Each one will report the same IP. Internally, all the devices have a different internal IP (the router has an internal IP too, like any other device on the local network).

Similarly, when you send a request from any of the VM guests to a machine outside the VMnet but inside the local network, you'll carry the internal IP of your VM host (i.e., the Windows machine). Internally, all the guests have a VMnet IP (the host has one too, and inside the VMnet it behaves like a guest).

Let me explain this a bit further with pictures.

Here, the Kali machine is part of the VMnet and can't directly contact the Mac machine or the Android machine. To reach them, it has to go via the Windows machine. The router doesn't know about the existence of the Kali machine (or the Windows XP machine). The path to the internet involves both the host machine and the router.

Here, Kali is directly part of the local network: the router knows about the Kali machine, and the path to the internet involves only the router.

So, what does this mean for us?

If you want to practice penetration testing, VMs can be great. You can have a Windows host, and Kali running as a virtual machine. Alongside, you can have Windows XP running as another guest VM. Now, these are a part of VMNet and directly connected. So, you can easily perform any attacks from Kali to this machine.
If you want to do real life pentesting, your target is probably over the internet. In that case, having Kali inside a virtual machine doesn’t help. Firstly, even if you are live booting Kali, you are a part of the local network, and to communicate with your target over the internet, you need to “forward” your requests through the router (this is called port forwarding). This, in itself, can sometimes be a pain in the ass. If you are inside a VM, your path to your target would involve your router, your host machine, and then the Kali Machine. This is quite inconvenient. So, if you want to attack someone over the internet, being in a virtual machine sucks.

In other words, your guest machine (Kali) does not have direct access to your laptop's network card; it has bridged access to it. In theory, you can still use most of the card's functionality, but in practice it's a painstakingly hard job. You can, however, add an external card and give it to the Kali guest instead of the Windows host, mitigating this problem. Read the food for thought below for more.

Food For Thought

When you are inside a virtual machine, you use your host to connect to the internet. But that doesn't have to be the case. You can plug in an external wireless card and connect to the router directly. That would mean you are now part of the VMnet as well as part of the LAN (your wlan0 card gets allocated an internal IP on the WLAN, say 192.168.1.5). Now you don't need your host for internet access, and as far as the router is concerned, you are a separate computer. So this does solve the problem that being inside a virtual machine causes. (I'm too lazy to draw a diagram for this, but in this case the diagram would have Kali as part of both the internal-network dotted box and the VMnet dotted box. This is exactly equivalent to the position of the Windows 8/10 machine in the first diagram. It will also have two IPs, one for the VMnet and one for the LAN.)

Ease/Risk

Live boot is the easiest to perform, and the least risky.

Virtual machine is a bit harder, but still not risky.

Dual boot is tough, and you run the risk of losing your data/ getting rid of your original OS, etc.

Also, sometimes dual booting can be next to impossible. For example, some laptops with Microsoft Signature Edition (usually the 2-in-1, laptop+tablet types) don't let you dual boot anything alongside Windows.

Forensics

Live booting doesn't leave behind many traces; the other two methods do.

How to find installation guides

For finding guides, keep the following pointers in mind:

Consult multiple resources before doing anything. There are thousands of guides for installing Kali, and there’s no ‘best’ guide.
Make sure to read the official documentation.
Make sure not to limit yourself to just written tutorials or just YouTube videos. Both have their own advantages and disadvantages.
Consult tutorials for your precise versions of software (e.g., how to install Kali Rolling alongside Windows 10), not simply Kali alongside Windows. There are only a few minor differences across the various releases and their install instructions, but when you're doing this for the first time, those minor differences are important.
Live USB is the easiest, go for it first. Go for Virtual machine if you’re interested in practicing Penetration Testing. 
Even the easiest method, Live USB, isn't trivial. If you're a beginner, even that will require some effort (changing the boot order / choosing USB as the boot device, finding a proper tool for making a bootable USB, etc.). Don't get discouraged.

Extra Advice

For wireless hacking, don’t even think about anything, go for live boot, it’s a no brainer.
For pentesting, when you’re just getting started and need to practice on local targets, go for Virtual machine.
When you’re comfortable with Linux, and feel that you can use Kali for usual stuff, only then install Kali alongside Windows. Still, I won’t suggest using Kali as your primary OS.
If you love Linux, and love challenges, then install Kali as your primary OS. If you do, see if you're able to figure out how to install Skype on Kali rolling release (if you succeed, please let me know; I haven't been able to do it so far, and anyway, Skype Web works fine).

The last point tells me that I'm getting carried away now, and this post needs to come to an end. Hope you learnt a lot. Let me know if you feel there's something important worth including that I missed.

How to Install and Use wget on CentOS


In this tutorial, we’re going to show you how to install and use wget on CentOS. We’ll include useful and practical examples of the wget command.

This tutorial will work on CentOS Stream, CentOS 8, RHEL, AlmaLinux, Rocky Linux, and others.
All the wget command examples will work on all distros, including Ubuntu and Debian.
How to fix the “-bash: /usr/bin/wget: No such file or directory” error
In some cases, wget may already be installed on your CentOS system. If it’s not installed, you’ll get the “no such file or directory” error. To fix it, you just need to install wget. Follow the instructions below to install wget.
How to install wget on CentOS
You can install wget with this single command:
sudo yum install wget -y
And now you can start using it without getting an error.
Tip: if you're using Fedora or a newer version of RHEL/CentOS, just replace "yum" with "dnf"; if you're using Ubuntu/Debian, replace "yum" with "apt".
How to use wget on CentOS (basic)
The basic syntax of the wget command is:
wget [OPTIONS] [URL]
Here’s a basic example:
wget https://linuxstans.com/wp-content/uploads/2020/10/LinuxStansIcon512.png
This command will download the PNG file to the directory you’re currently in.
If the URL contains special characters ( * ? ] [ ), then you just need to put the URL in quotes. As a safety measure, you can always put URLs in quotes when using wget, regardless of whether they contain special characters.
We’ll show you more specific useful examples below.
How to save the file with a different name
When downloading a file, you can save it under a different file name. To do that, you should use the “-O” option:
wget -O logo.png "https://linuxstans.com/wp-content/uploads/2020/10/LinuxStansIcon512.png"
This command will save the “LinuxStansIcon512.png” image as “logo.png”
How to save the file in a different directory
If you don’t want to download the file in the directory you’re currently in, you can use the “-P” option to specify a different directory. For example:
wget -P /var/www/logos/ https://linuxstans.com/wp-content/uploads/2020/10/LinuxStansIcon512.png
This will download the “LinuxStansIcon512.png” image to the /var/www/logos directory.
How to download a full website
You can also use the wget command to download a full website. To do that, you should use the “-m” option:
wget -m https://linuxstans.com
This command will download the whole Linux Stans website. This is useful if you often use your computer offline, and still want to read whatever a website has published. The -m option creates a mirror of the website you specify.
How to run the downloading in the background
If it’s a larger file, the downloading may take longer. So to run the wget downloading process in the background and continue using your system, you need to use the “-b” option. One example is when downloading a full website. So you can combine both options like:
wget -m -b https://linuxstans.com
This command will download the full website in the background.
How to use wildcards to download all files in a certain FTP directory
One useful example when using the wget command to download FTP files is using a wildcard. For example, to download all PNGs from an FTP directory, you can use the * wildcard, like below:
wget ftp://linuxstans.com/logos/*.png
This will download all PNG files from the logos folder from the LinuxStans FTP server.
How to download password-protected files
Some files may have password protection. To download them, you need to include the HTTP username and password in the wget command, like so:
wget --user=linuxstans --ask-password https://linuxstans.com/wp-content/uploads/2024/03/ProtectedFile.zip
This command will use “linuxstans” as the username and you will be prompted to enter the password after you enter the wget command. You can also enter the password directly in the wget command, but it’s less secure because the password will be visible to other users.
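If you do want to pass the password inline anyway, wget also accepts the --password option (a sketch; "s3cret" is a placeholder value, and the real password would be visible in your shell history and process list):

wget --user=linuxstans --password='s3cret' https://linuxstans.com/wp-content/uploads/2024/03/ProtectedFile.zip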
How to download multiple files from a .txt file
If you have a text file with multiple URLs (one URL per line), for example:
cat urls.txt

https://linuxstans.com/wp-content/uploads/2024/03/file1.zip
https://linuxstans.com/wp-content/uploads/2024/03/file2.png
https://linuxstans.com/wp-content/uploads/2024/03/file3.jpg
https://linuxstans.com/wp-content/uploads/2021/07/randomFile.txt
You can use the “-i” option to download all the files from that text file, like this:
wget -i urls.txt
How to download multiple files within wget
If you don’t want to use a .txt file and you have multiple URLs you want to download at once, you can use \ to separate them, for example:
wget https://linuxstans.com/wp-content/uploads/2024/03/file1.zip \
https://linuxstans.com/wp-content/uploads/2024/03/file2.png \
https://linuxstans.com/wp-content/uploads/2024/03/file3.jpg \
https://linuxstans.com/wp-content/uploads/2021/07/randomFile.txt
This will also download all 4 files.
And more!
There are many more useful examples of how to use the wget command. You can go through the help page to get an idea of what other options are available:
wget --help
How do you use wget? What’s a useful and practical example you can share?

How to Make Docker Compose Always Use the Latest Image


By default, Docker Compose does not always pull the latest version of an image from a registry. This can lead to using outdated images and potentially missing bug fixes or new features. This tutorial discusses various methods to make Docker Compose always use the latest image.

Method 1. The --pull flag

The simplest way to ensure Docker Compose pulls the latest image is to use the --pull flag when running docker-compose up:

docker-compose up --pull always
This flag forces Docker Compose to attempt to pull the latest versions of all images specified in your docker-compose.yml file before starting or restarting services.

Method 2. The image: tag strategy

In your docker-compose.yml file, you can specify the image you want to use along with the latest tag:

services:
  redis:
    image: redis:latest
    ports:
      - "6379:6379"
      - "16379:16379"
    volumes:
      - redis-data:/data
      - ./redis.conf:/usr/local/etc/redis/redis.conf

volumes:
  redis-data:
This seems very easy, but it's important to note that using latest doesn't always guarantee you will get the absolute latest image. For example, if you had previously pulled an image tagged latest, Docker might use the cached version unless you explicitly tell it to pull again (using the --pull flag). The best way to handle this is to first stop and remove the existing containers and images, then pull the latest images from the registry, and finally start the containers in detached mode (-d), rebuilding them if there are changes in the Dockerfile (--build):

docker-compose down --rmi all
docker-compose pull
docker-compose up -d --build
Method 3. Build images locally

Another common way to make Docker Compose use the latest image is to build images locally from a Dockerfile. This way you are using the latest code, by rebuilding the image before running docker-compose up:

docker-compose build --no-cache
docker-compose up
The --no-cache flag tells Docker to rebuild the image from scratch, incorporating any changes you have made.

Method 4. Use Watchtower

Watchtower is a utility that runs as a Docker container itself. Its primary function is to monitor other Docker containers on the same host and automatically update them to newer image versions when they become available. Watchtower is easy to set up. You can run it as a standalone container:

docker run -d \
  --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower
Or integrate it into your docker-compose.yml file:

services:
  redis:
    image: redis:latest
    ports:
      - "6379:6379"
      - "16379:16379"
    volumes:
      - redis-data:/data
      - ./redis.conf:/usr/local/etc/redis/redis.conf

  watchtower:
    image: containrrr/watchtower:latest
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    command: --schedule "0 4 * * *" --cleanup --stop-timeout 300s

volumes:
  redis-data:
Using the latest image is not always best practice

While always using the latest image might seem ideal, it's important to understand that it is not always the best approach, as the latest tag can be unpredictable. The image it refers to might change unexpectedly, introducing unforeseen issues into your environment, or new image versions might contain breaking changes. That's why it's important to follow these best practices:

Specific tags: Instead of relying on the latest tag, read the change notes and use specific image tags, for example redis:7.2.5. This gives you more control and predictability.
Regular updates: Establish a schedule for updating your images, so you benefit from bug fixes and new features while minimizing the risk of unexpected issues.
Testing environments: Always test new image versions in a staging environment before deploying them to production, especially when you are using Watchtower or any other tool that automatically updates images.

Wrapping up

It's quite easy to make Docker Compose use the latest image; however, it's important to follow the best practices above to avoid breaking things and to maintain a balance between staying up to date and preserving stability.

Linux One Liner :: Organizing for 2024 at the command line


Yes, you can use the power of the command line to help you get ready for 2024. What I wanted was a directory for each month in a few places (my overall schedule, bills and finances, and such). I did it with one command! Let me walk you through it.

Here are the steps:

I went into my home directory and then into Documents.
I made a new directory named 2024 and changed into it.
Here is the "magic command" to create directories for all the months at once (see the breakdown after this list):

(IFS=';'; mkdir $(locale mon))

With a simple ls command, we can see that all the month directories are there!
I can copy the 2024 directory anywhere I need it.
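Here's a quick breakdown of why the one-liner works (a sketch assuming an English locale; locale mon prints your locale's month names, so the directory names follow your language settings):

locale mon
# January;February;March;...;December   <- semicolon-separated month names
(IFS=';'; mkdir $(locale mon))
# IFS=';' makes the shell split that list on semicolons, so mkdir receives each
# month as a separate argument; the surrounding ( ) runs everything in a
# subshell, leaving your interactive shell's IFS untouched.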

You can see more Linux One Liners – Here!
I found a great site that you will love, which helps break down complex commands like this one: Explain Shell