
UBIFS File-System Being Hardened Against Power Loss Scenarios

The Nokia N900; Nokia originally developed UBIFS

While most Linux file-systems are rather robust in recovering when the system experiences a power loss, the UBIFS file-system is more prone to problems when a power-cut happens. With patches submitted for the Linux 6.11 merge window, UBIFS is seeing some hardening so it can better cope with the loss of power.
The Unsorted Block Image File System (UBIFS) for flash memory devices can run into problems and be left in an inconsistent state if power is unexpectedly lost. Fortunately, most UBIFS deployments on smartphones and other devices are battery-backed and can power down safely before the battery is exhausted, but Linux 6.11 brings some hardening improvements for UBIFS to deal with this situation.
A set of nine patches is part of the UBIFS pull request for Linux 6.11, tackling some of the inconsistency problems that can happen when there is a power-cut. These fixes and hardening make the flash file-system more robust, but they do come with a small performance cost. In testing by developer Zhihao Cheng, sequential writes dropped from 412MB/s to 409MB/s, the FS-Mark benchmark fell from 7131 files/s to 7090 files/s, and other small performance drops were noted as a result of these patches. But a small performance cost is better than potential data loss in the event of unexpectedly losing power.
This better UBIFS handling of power-cut situations was sent in as part of the UBIFS pull request for the Linux 6.11 merge window set to end this weekend.

New Linux ‘Screen of Death’ Options: Black – or a Monochrome Tux Logo


It was analogous to the “Blue Screen of Death” that Windows shows for critical errors, Phoronix wrote. To enable error messages for things like a kernel panic, Linux 6.10 introduced a new panic handler infrastructure for “Direct Rendering Manager” (DRM) drivers. Phoronix also published a follow-up from Red Hat engineer Javier Martinez Canillas, who was involved in the new DRM Panic infrastructure.

Given complaints about being too much like Microsoft Windows following his recent Linux “Blue Screen of Death” showcase… Javier showed that a black screen of death is possible if so desired… After all, it’s all open-source and thus can be customized to your heart’s content.
And now the panic handler is getting even more new features, Phoronix reported Friday:

With the code in Linux 6.10, when DRM Panic is triggered, an ASCII art version of Linux’s mascot, Tux the penguin, is rendered as part of the display. With Linux 6.11 it will also be able to display a monochrome image as the logo.

If ASCII art on error messages doesn’t satisfy your tastes in 2024+, the DRM Panic code will be able to support a monochrome graphical logo that leverages the Linux kernel’s boot-up logo support. The ASCII art penguin will still be used when no graphical logo is found or when the existing “LOGO” Kconfig option is disabled. (Those Tux logo assets are here.)

This monochrome logo support in the DRM Panic handler was sent out as part of this week’s drm-misc-next pull request ahead of the Linux 6.11 merge window in July. This week’s drm-misc-next material also includes TTM memory management improvements, various fixes to the smaller Direct Rendering Manager drivers, and the previously discussed monochrome TV support for the Raspberry Pi.

Long-time Slashdot reader unixbhaskar thinks the new option “will certainly satisfy the modern people… But it is not as eye candy as people think… Moreover, it is monochrome, so certainly not resource-hungry. Plus, if all else fails, the ASCII art logo is still there to show!”

DeepFake software – Can it bypass identity verification?

Sohail

In 2002, a Japanese researcher named Tsutomu Matsumoto demonstrated how simple methods could trick a fingerprint sensor. He used gummy bear candy to create a copy of a fingerprint obtained from a glass surface. His handmade fake fingerprint successfully fooled the sensor in 4 out of 5 attempts, highlighting vulnerabilities in biometric security systems.

When today’s software tools are cleverly combined with deepfakes and other plugins, they can generate all the confidential data required to circumvent identity verification, which makes any internet user vulnerable to identity theft and fraud. What’s even more alarming is that the attacker may not even be directly connected to you. They can simply feed photos and videos from your social media accounts into these tools to produce realistic images and videos for use against live detection and identification.

What is the deepfake effect on identity verification?

As of April 2023, one-third of all businesses reported experiencing video and audio deepfake attacks. Latin America experienced a 410% surge in deepfake usage for identity fraud, and globally there has been a 10X increase in deepfake use between 2022 and 2023. The simple truth is that deepfakes and generative AI technology have made all identity verification models vulnerable to attack, giving rise to unreal people, fake account access, impersonation, identity theft, scams, and fraud.

Methods hackers use to bypass identity verification

Spoofing

The use of spoofing to deceive users and perpetrate cybercrimes has evolved beyond simple tactics like altering letters in emails or website addresses. Now, it encompasses sophisticated techniques such as manipulating human faces, cloning voices, and even passing live detection with realistic video gestures. For companies prioritizing security, especially to comply with AML laws, it’s crucial to grasp how these methods are employed by malicious actors.

Using images from the internet

To verify the authenticity of a user, hackers can readily gather images of their victims from social media or other sources. In some cases, they employ photo editing software to manipulate these images to suit their needs. For example, in authentication processes that require users to hold up an ID card or another form of legal identification, malicious actors can easily obtain photos from Facebook and swap faces through software like Deepswap, without the victim’s knowledge.

High-end edited or pre-recorded videos

Any active social media user might unwittingly fulfill the basic requirements needed for simple verification technology. Actions like smiling or closing and opening one’s eyes can be used to deceive facial recognition systems that lack sophistication. Although many facial recognition systems require live video, hackers can resort to unethical methods, such as using pre-recorded videos, to bypass these systems. Akool deepfake can be used for video face swaps and can also add facial gestures, which illicit users can use to compromise identity systems.

The use of synthetic masks

This is a common method of spoofing employed by attackers, and it often requires double checks or highly trained models to detect effectively. Machine learning models are typically trained on high-quality images of faces to determine whether the user’s face is genuine or not. However, attackers can exploit factors such as poor lighting conditions to deceive the system into identifying a synthetic mask as the legitimate owner of the account. Models with limited training data may also not be sufficiently trained to accurately differentiate between real and synthetic faces.

Deepfakes

This is one of the fastest methods of identity theft, as attackers only need minute details to gather all the information necessary to validate their victim’s identity. In 2024, generative AI is at the forefront of advancing businesses into new horizons. However, it’s also a fact that hackers and fraudsters are actively seeking ways to leverage these cutting-edge technologies for malicious purposes. This underscores the importance for companies to invest heavily in security technologies and consistently update their tech infrastructure to avoid falling victim to such attacks.

Safety measures against deepfakes

The world is already experiencing a tenfold increase in deepfake usage, signaling that any company, regardless of size, could be the next target of unidentifiable or unwanted users. For companies adhering to KYC and AML regulations, staying ahead of the game is essential. Here are a few tips:

Robust identity document checks

Given that any legal document can be easily forged, it’s imperative to conduct thorough scrutiny of each document submitted. To combat the prevalence of deepfake usage, companies can consider implementing dedicated training models integrated into their identity verification processes.

Detailed KYC

KYC checks shouldn’t stop at ID cards. If you’re a fintech product, consider mapping out strategic questions and approaches that delve deeper into verifying user identities. This gives a prior understanding of what the customer’s financial behavior should look like, and it will effectively reveal lapses and inconsistent details.

Ongoing monitoring of users

Every illicit user has an objective and a motive behind their actions, emphasizing the need for consistent monitoring. This means that every behavior is logged, and in cases of uncertainty, approaches can be developed to counteract these behaviors. For example, Twitter, a social media platform notorious for bot activity, has implemented various measures to combat bots, such as the Arkose challenge that suddenly appears during usage.

Conclusion

Regulations are certainly being implemented to ensure the safe use of these technologies. However, the harsh reality is that bad actors will continue to find ways to exploit vulnerabilities. At this point, everyone is vulnerable, and the best course of action is to take extra precautionary steps. Brands can also start employing third-party identity verification companies equipped with counter-AI deployment tools and additional scrutiny features to continuously safeguard their brands.

SoftMaker FreeOffice: A cross-platform Office suite that’s fully compatible with MS Office


Most Linux users are well-acquainted with LibreOffice; many distributions have it pre-installed. Fewer know its powerful alternative: FreeOffice is a full-fledged office solution with full support for Microsoft Office file formats. It consists of a word processor, a spreadsheet and a presentation program. True to its name, FreeOffice is fully free and available for Linux in 32-bit and 64-bit versions.

FreeOffice is far from a LibreOffice clone. The software is developed by a German software company with a history going all the way back to 1987. Due to its background, FreeOffice has far more in common with Microsoft Office than with LibreOffice. FreeOffice’s crucial advantage over Microsoft Office is its cross-platform availability: the suite is available for Linux, macOS and Windows.

Instead of packing all its functionality into a single application à la LibreOffice, FreeOffice is split up into three applications: TextMaker, PlanMaker and Presentations. This has the advantage of faster start-up times, and each application launches with an empty document of the proper type: a text document, a spreadsheet or a presentation.

The most salient feature is full Microsoft Office format compatibility. FreeOffice opens and saves Microsoft Word documents (doc, docx), Excel spreadsheets (xls, xlsx) and PowerPoint presentations (ppt, pptx). FreeOffice will process even the most intricately formatted Microsoft documents and save them without loss of formatting.

FreeOffice users can choose between a classic, menu-based interface and a ribbon in the style of current Microsoft products. The application can be used with either a dark or light theme. The optional ribbon follows Microsoft’s lead far more closely than LibreOffice’s Notebookbar. Users trying out the ribbon interface can always access the traditional menu structure through a hamburger menu button just below the ribbon.

TextMaker is a full-fledged word processor. Its interface can be enhanced with an optional sidebar which displays either a WYSIWYG overview of all paragraph styles and character styles or helpful usage tips. The sidebar is used to select styles as well as to manage and update them. For spelling and hyphenation, FreeOffice relies on the trusted open-source Hunspell dictionaries. Apart from Microsoft Office files, TextMaker also opens and saves OpenDocument files and handles RTF, HTML and text documents. The change tracking feature is fully compatible with Microsoft Word: notes and changes created in Word can be processed in TextMaker and vice versa.

PlanMaker is the suite’s spreadsheet. Here, the sidebar provides options to manage pivot tables. File format support includes Symbolic Link (SYLK), RTF, HTML, dBASE, DIF and CSV, and, of course, all Excel formats.

The presentation application is appropriately called Presentations. Its sidebar provides direct access to slide layouts, designs, color schemes, backgrounds, transitions and animations. Animations and transitions are OpenGL-based. Aside from supporting PowerPoint formats, slides can also be saved in RTF documents and exported as PDFs or images.

FreeOffice can be downloaded through several repositories; an online guide explains how to install the suite in Ubuntu, Debian, openSUSE, Linux Mint, Fedora and other distributions. Permanent use requires a product key which is requested from within the application; the key arrives instantly by e-mail.

SoftMaker’s main product, SoftMaker Office, is an expanded version of FreeOffice. The commercial version expands on FreeOffice’s robust basic toolset with a number of productivity features, and SoftMaker maintains a comprehensive comparison of the differences between the free and commercial versions of its applications. SoftMaker Office offers additional features such as customizable ribbons, high-quality commercial dictionaries and thesauri, as well as advanced file backup with version management.

The commercial version of TextMaker includes additional tools to manage large documents, such as an outliner mode and options for cross-references and bibliographies. A style manager can be used to transfer paragraph styles and character styles from one document to another. The commercial version of PlanMaker includes input validation, data transposition and consolidation, and interactive forms. The version of Presentations included in SoftMaker Office features a presenter view, photo albums, summary slides and charts. Users who upgrade to SoftMaker Office using the in-app purchase button in the upper right corner are offered a significant discount on the regular price.

FreeOffice is a free download for Linux, Mac and Windows at https://www.freeoffice.com.

How do business leaders leverage Data Science? – NoobsLab


In the dynamic landscape of today’s digital world, the heartbeat of successful businesses resonates with data. This unprecedented surge in information has given rise to an invaluable instrument: Data Science. This multidimensional discipline, interweaving statistics, machine learning, and domain expertise, has the remarkable ability to distill priceless insights from intricate datasets.

Visionary business leaders recognize Data Science as a potent catalyst, utilizing its prowess to make calculated decisions, forecast trends, and attain a formidable competitive advantage.

The Data Science Advantage
Data Science transcends the world of mere numerical manipulation and taking a Data Science Training is advantageous. It’s an art of deciphering narratives hidden within those numbers. Business leaders harness this unique advantage to unveil patterns and trends that inform their strategies. Imagine steering a retail enterprise. By dissecting buying behaviors and demographic nuances, you can customize your product lineup to harmonize with customer preferences. This tailored approach elevates customer satisfaction and propels sales figures. Through Best data science courses, professionals and aspiring learners can acquire the skills necessary to navigate the complexities of data and contribute significantly to their organizations’ success.

Empowering Informed Decision-Making
Gone are the days of relying solely on intuition to navigate decisions. Data Science empowers leaders to craft informed choices substantiated by empirical evidence. Through meticulous analysis of historical data, prognostic modeling, and intricate algorithms, they can anticipate seismic shifts in markets and gauge customer appetites. For instance, envision a marketing czar fine-tuning the most effective advertising channels through a retrospective dissection of past campaign triumphs. This method obliterates guesswork, ushering in a new era of maximized return on investment.

Elevating Customer Experience
A cornerstone of thriving businesses is their unwavering commitment to customer-centricity, a commitment that Data Science magnifies. Positioned at the forefront, Data Science illuminates the intricate pathways of customer behavior, forming the bedrock for crafting tailor-made experiences. By meticulously analyzing customer feedback, observing browsing patterns, and delving into transaction histories, leaders meticulously curate personalized recommendations and communications.

This elevated tier of customization forges unbreakable links between businesses and their customers, nurturing a bond of loyalty that goes beyond the ordinary. As Data Science unfurls its capabilities, leaders find themselves armed with insights to not just meet but exceed customer expectations, embarking on a journey that not only enhances the bottom line but also etches the brand’s legacy in the annals of exceptional customer engagement.

Optimization of Operations
The ethos of efficiency pervades the echelons of business success. Data Science galvanizes leaders in fine-tuning operations by spotlighting bottlenecks and inefficiencies within processes. Imagine a supply chain steward meticulously poring over production data and logistical intricacies to streamline workflows, slash expenditures, and orchestrate punctual deliveries. This orchestration augments not only the financial bottom line but also the overarching business reputation.

Mitigating Risk
In the tapestry of business endeavors, risk is an ever-present thread. Data Science dons the mantle of a safeguard, enabling leaders to assess potential hazards and contrive mitigation blueprints. Financial institutions, for instance, employ predictive models to evaluate the creditworthiness of borrowers, effectively reducing the peril of defaults and underpinning sound lending protocols.

Pioneering Innovation and Product Evolution
At the heart of business growth lies innovation, a catalyst that Data Science has the potential to fuel. By delving into market trends and embracing consumer feedback, leaders are empowered to uncover unaddressed needs, allowing them to sculpt products that resonate. This proactive approach not only mitigates the risk of product missteps but also cultivates an environment of perpetual innovation, breathing life into a thriving ecosystem of continuous improvement and forward-thinking evolution.

By harnessing the power of Data Science, business leaders can stoke the flames of creativity, sculpting offerings that not only satisfy existing demands but also pave the way for uncharted realms of consumer delight, all while solidifying their standing in a fiercely competitive landscape.

Confronting Hurdles
While the realm of Data Science presents a constellation of benefits, it’s not devoid of challenges. Business leaders find themselves navigating hurdles that require strategic finesse. Among these challenges, the sanctity of data privacy and the assurance of data quality stand as paramount concerns. In a landscape where data breaches can have far-reaching consequences, safeguarding customer information becomes non-negotiable.

Additionally, the pursuit of skilled Data Scientists emerges as a modern-day quest, as these professionals are the architects of insights that drive progress. And yet, the journey doesn’t end with data acquisition; the intricate outputs of data analysis demand a nuanced understanding to prevent misinterpretation. In this intricate dance between potential and pitfalls, business leaders act as navigators, employing a multidisciplinary approach that combines technical expertise, ethical considerations, and strategic foresight.

By steering through these challenges with wisdom and adaptability, they not only harness the potential of Data Science but also ensure a secure and prosperous voyage toward data-driven success.

Bridging the Chasm: Collaborative Synergy
To harness the full spectrum of Data Science’s potential, collaborative synergy between business leaders and Data Scientists is a sine qua non. Business leaders infuse domain expertise and articulate the problems that warrant resolution, while Data Scientists contribute their analytical acumen. This harmonious partnership guarantees that data-driven revelations harmonize seamlessly with business objectives.

Charting the Path Forward
In tandem with technological progress, the panorama of Data Science continues to evolve. Business leaders must remain attuned to the vanguard of trends and tools to perpetuate their competitive edge. Embracing automation, machine learning, and AI-driven insights will constitute the bedrock for staying at the forefront of this evolutionary tide.

Take a pause and look at this Data Science Course to increase your knowledge!

In Summation
In an era characterized by data’s dominance, Data Science transcends the realm of mere terminology; it emerges as a strategic imperative. Business leaders who harness its might engender an unequivocal advantage over rivals. From shaping calculated decisions to elevating customer encounters and fanning the flames of innovation, Data Science serves as the North Star guiding modern business triumphs.

As impediments mutate and technology strides forth, the symbiotic interplay between human expertise and data-steered insights will sculpt the visage of the business future. Hence, if you find yourself navigating the corridors of business leadership, regard Data Science not as a mere implement but as a metamorphic force capable of steering your enterprise toward uncharted horizons.

Celebrating Ethernet’s 50th Anniversary – Pixelated Dwarf

Ethernet @ 50

On May 22, 1973, a groundbreaking technology was born that would forever revolutionize the way we connect and communicate. Ethernet, the underlying technology facilitating the transfer of data across networks, has now reached its momentous 50th anniversary. It was invented by Robert Metcalfe, David Boggs, Chuck Thacker, and Butler Lampson at Xerox Corporation’s Palo Alto Research Center (PARC).
This article celebrates 50 years of Ethernet!
Why is this still important now?
The main reasons are: Ethernet is the backbone of the internet, and it is still being used after all these years. Many improvements have been made over time to make things faster and better, but the idea of sending packets of information over a wire remains the same. It is amazing that any fifty-year-old technology has survived, let alone remains in current use. That is the genius of the concept of Ethernet.
The birth of Ethernet
For those of you reading who were not born yet, let me put things into perspective: computers in the 1950s and 1960s were massive. They took up entire floors of buildings. There were no personal computers at that time.
As corporations and government agencies acquired more of these computers, they wanted them to “talk” to each other and share information. In the early 1970s, ALOHAnet transmitted packets over the air using a random-access scheme, and Ethernet refined that idea into “Carrier Sense Multiple Access with Collision Detection” (CSMA/CD) over a shared wire. The interoperability of many types of devices on a single communications wire became possible.
The Evolution of Ethernet
I remember the phrase from an old commercial: “You’ve come a long way, baby!”. Ethernet has cleared many hurdles on its way to higher speeds.

1973 – Ethernet’s initial speed is only 2.94 Mbps over coax.
1980s – IEEE Standard for Ethernet – 10 Mbps.
1990s – 100 Mbps (Fast Ethernet)
2000s – 1 Gbps (Gigabit Ethernet)
2023 – 10 Gbps and beyond

Ethernet has had a significant impact on the modern internet in various ways:

Speed and Bandwidth: Ethernet technology has substantially increased internet speeds and bandwidth. Initially, Ethernet ran at speeds of 10 Mbps, but with advancements like Fast Ethernet (100 Mbps), Gigabit Ethernet (1 Gbps), 10 Gigabit Ethernet (10 Gbps) and beyond, it has enabled faster data transmission and increased network performance.
Standardization: Ethernet has become the de-facto standard for wired LAN (Local Area Network) connections. Its clarity, reliability, and compatibility across different devices, operating systems, and vendors have contributed to the widespread adoption and interoperability of Ethernet-based networks.
Scalability: Ethernet has allowed for network scaling and growth. Due to its flexible design, Ethernet can accommodate both small office networks as well as large enterprise networks, enabling the integration of multiple devices, such as computers, servers, routers, switches, and more.
Internet Backbone: Ethernet has played a crucial role in building the internet backbone. As high-speed Ethernet connections became prevalent and affordable, it became feasible to establish faster and more reliable connections between internet service providers (ISPs), data centers, and other network infrastructure components, paving the way for faster internet connections globally.
Internet of Things (IoT): With the rise of IoT devices, Ethernet has offered a reliable and secure means of connecting these devices to the internet. Ethernet connectivity enables seamless integration of IoT devices into existing networks, facilitating data exchange, control, and management.
Multimedia Streaming: Ethernet’s higher speeds have been instrumental in supporting multimedia streaming services like video streaming, online gaming, and video conferencing. With faster Ethernet connections, users can access high-quality media content, interact with others via video calls, and enjoy lag-free gaming.

Conclusion
I used to teach a class about the history of computing at a couple of community colleges, and I loved teaching those classes. Ethernet was “born” out of ALOHAnet and its way of transferring data. The evolution from a simple diagram (drawn on a coffee-stained napkin) to the Ethernet of today is amazing.
This month we celebrate Ethernet turning 50 years old. For any technology to stand the test of time and change for the better is a testament to the brilliance of the initial idea. I am happy to have been there for the milestone.

10 Things to Do After Installing Fedora 40 (Workstation)


We are presenting our traditional Fedora release article – “10 Things to Do After Installing Fedora 40”, with post-install tweaks.

In this article, we will talk about a few post-install tips for the Fedora 40 workstation edition. They are a good starting point after installing a fresh Fedora 40 workstation, whatever kind of user you are.

Here are the ten things which you can do after installing Fedora 40.

10 things to do after installing Fedora 40 (GNOME Edition)

Making dnf a little faster

The dnf package manager may feel a little slow with its default configuration, and the upcoming dnf5 is yet to land in Fedora (currently planned for Fedora 41). Hence, you can make the following changes at the beginning to make it a little faster.

Open a terminal window and open the dnf configuration file via the default text editor.

sudo gnome-text-editor /etc/dnf/dnf.conf

Add the following line at the end of the file, then save and close. This allows dnf to download that many packages in parallel; you can use any value from 3 to 20.

max_parallel_downloads=10

Switch to a faster dnf mirror

In addition to the above change, in the same dnf configuration file, add the following line at the end:

fastestmirror=True

Save and close the text editor. These two changes are sufficient to make dnf noticeably faster.
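If you prefer not to open an editor, both settings can be appended non-interactively. A minimal sketch, demonstrated against a scratch copy of dnf.conf (on a real system, point it at /etc/dnf/dnf.conf and run the appends through sudo tee -a):

```shell
# Demonstrated on a scratch copy; use /etc/dnf/dnf.conf (with sudo) on a real system.
conf=/tmp/dnf.conf.demo
printf '[main]\ngpgcheck=True\n' > "$conf"    # stand-in for the existing file

# Append each setting only if it is not already present
grep -q '^max_parallel_downloads=' "$conf" || echo 'max_parallel_downloads=10' >> "$conf"
grep -q '^fastestmirror=' "$conf" || echo 'fastestmirror=True' >> "$conf"

cat "$conf"
```

The grep guards make the snippet safe to re-run: it never duplicates a line that is already in the file.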

dnf configurations

Update your system

Once you make the above changes, it’s a good idea to refresh your system. This is to ensure that you have all the latest packages and modules before you start using them or making further changes.

To do that, you can open the Software app and hit check for updates.

Or, I would recommend you open a terminal and run these simple commands.

sudo dnf upgrade

Upgrading Fedora 40

Enable RPM Fusion

I recommend enabling the RPM Fusion repo since it provides additional packages (including non-free ones). It helps with the future installation of several applications (such as Java, JRE, FFmpeg, etc.). RPM Fusion is a community-contributed repo, a collection of non-free and additional packages that Fedora Linux cannot ship in its official ISO file due to license and other terms.

To enable RPM Fusion in Fedora 40, open a terminal and run the following commands in sequence.

sudo dnf install https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm

sudo dnf install https://download1.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm

After completing the above commands, run the following to update your system.

sudo dnf upgrade --refresh
sudo dnf groupupdate core
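Incidentally, the $(rpm -E %fedora) fragment in the install URLs above expands to your running release number, which is why the same commands work on any Fedora version. A small sketch of how the URL is assembled (hard-coding 40 as a stand-in, since rpm may not be available everywhere):

```shell
# On a real Fedora system you would use: rel=$(rpm -E %fedora)
rel=40    # stand-in release number for illustration
url="https://download1.rpmfusion.org/free/fedora/rpmfusion-free-release-${rel}.noarch.rpm"
echo "$url"
```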

Firmware Updates

If your hardware manufacturer provides firmware packages for Linux, you can quickly check for and apply those updates via the following sequence of commands. Updates may not always be available, but it is worth trying.

sudo fwupdmgr refresh --force
sudo fwupdmgr get-updates
sudo fwupdmgr update

Learn to use new version dnf5

The new and advanced version of the dnf package manager, dnf5, has been included in the repos for the last few Fedora releases. It will become the default in a future release.

However, you can install it today and take advantage of the fastest dnf ever. Here’s how you can install it.

sudo dnf install dnf5 dnf5-plugins

After installation, you can start using dnf5 instead of dnf from the command line. Check the man pages or dnf5 --help for more commands.

Installing dnf5 in Fedora 40

Install GNOME Tweaks

GNOME Tweaks is an essential application for the Fedora 40 Workstation. It helps you manage many areas of your GNOME desktop, such as changing fonts, applying GTK themes, etc. To install it, open a terminal and run the following command.

sudo dnf install gnome-tweaks

Explore unrestricted flathub apps

Fedora 40 pre-loads Flatpak by default. It also allows unrestricted access to all Flathub apps. All you need to do is enable the flathub remote (which is disabled by default) using the command below.

flatpak remote-modify --enable flathub

Now you can visit Flathub’s official website to install thousands of Flatpak apps. Alternatively, you can use GNOME Software to install Flathub apps, which now brings a better Flatpak experience; for example, it is now possible to remove user data when uninstalling Flatpaks.

Enabling Flathub in Fedora 40

Install Extension Manager App

After you set up Flathub in the step above, install the most useful companion app: “Extension Manager”. It allows you to search for, install and remove hundreds of GNOME extensions right from the desktop; you do not need to visit the official web page of each extension.

To install the Extension Manager app, open a terminal and run the following.

flatpak install flathub com.mattjakeman.ExtensionManager

Wondering which extensions to install? Check out the next tip.

Install these recommended GNOME Extensions

You can extend the GNOME 46 experience with more extensions.

You can also check out specific customizations of quick settings and GNOME’s top bar using the guides below.

Bonus Tip(s)

And finally, here are four bonus tips exclusively for you.

Install Recommended Applications

The default Fedora 40 Workstation ships only a handful of applications, which may not be sufficient for daily use. Here’s a quick set of commands to install some recommended ones, including a torrent client, a good media player, a more advanced photo editor, and more.

Copy and paste these into the terminal to install.

sudo dnf install -y vlc
sudo dnf install -y steam
sudo dnf install -y transmission
sudo dnf install -y gimp
sudo dnf install -y geary
sudo dnf install -y dropbox nautilus-dropbox
sudo dnf install -y unzip p7zip p7zip-plugins unrar

If you prefer Flatpaks, here’s the command for that.

flatpak install flathub org.videolan.VLC
flatpak install flathub com.valvesoftware.Steam
flatpak install flathub com.transmissionbt.Transmission
flatpak install flathub org.gimp.GIMP
flatpak install flathub org.gnome.Geary

Enable Battery percentage in tray (not in quick settings)

If you want to view the battery percentage in the system tray, run the following command to enable it via gsettings.

gsettings set org.gnome.desktop.interface show-battery-percentage true

Install nice-looking fonts

GNOME desktop’s default font on Fedora 40 is good, but if you crave more, here are some cool fonts you can install. After installation, you can use GNOME Tweaks to change the font.

sudo dnf install -y 'google-roboto*' 'mozilla-fira*' fira-code-fonts

TLP

Finally, you should install TLP if you are a laptop user. TLP is a great utility for optimizing your laptop’s battery life. It comes with various command-line options to tweak settings and view reports about power consumption. All you need to do is install it and forget it; it takes care of the basic power-saving optimizations. Remember not to use TLP together with any other power-management tweaks.

sudo dnf install tlp tlp-rdw

Closing Notes

I hope you enjoyed reading these tips and applying some of them. So, what is your favorite must-do post-install tip? Let me know in the comment box down below!

Kali Linux 2023.3 Release (Internal Infrastructure & Kali Autopilot)



Today we are delighted to introduce our latest release of Kali, 2023.3. This release blog post does not have the most features in it, as a lot of the changes have been behind-the-scenes, which brings a huge benefit to us and an indirect positive effect to you as end-users. As always, there are a number of new packages and tools as well as the standard updates. If you want to see what’s new for yourself, download it, or upgrade if you have an existing Kali Linux installation.

The highlights of the changelog since the 2023.2 release from May:

Internal Infrastructure

With the release of Debian 12, which came out this summer, we took the opportunity to re-work, re-design, and re-architect our infrastructure. It is as massive as it sounds, and it should not be a surprise that it is not yet complete! This is where a good amount of our focus has been for this release cycle (and, unfortunately, the next one as well). We are hoping that the majority of it will be done by the end of the year (so we can get back to what we do best!).

This gives us an excuse and the motivation to simplify our software stack as much as possible, for example by settling on a single:

OS version (Debian 12)
CDN/WAF (Cloudflare)
Web server service (Nginx)
Infrastructure as Code (Ansible)

We also have some other goals, such as replacing certain software with alternatives (phase #2). At the same time, we have automated some actions. We are very much underway with these projects already (as bug bounty hunters may notice the changes)!

Mirror Traces

We have a new sub-domain, mirror-traces.kali.org! This is to help mirror admins for our community mirrors. It gives everyone using it more details and insight, which is useful when troubleshooting and debugging issues. True to our word, we are doing more in the open; the git repository can be found at gitlab.com/kalilinux/tools/mirror-status.

Packaging Tools

For a long time, we have shared our home-made scripts publicly, which are our helping aid to manage all our packages in Kali.
Recently we have expanded on them, giving the existing files a refresh, adding additional features and various quality-of-life improvements, as well as including new ones. As a recap, if you want to have a peek at some back-end development:

AutoPkgTest – Using debci in a CI fashion, we can test packages as they are built. This integrates into Britney.
Britney2 (Git repo) – Migrates packages between all of our suites (aka branches, such as “debian-testing”, “kali-rolling”, and “kali-last-snapshot” to name a few).
Build-Logs – Output of our images/platforms as well as packages being created on each supported architecture.
Janitor – Our automated packager; it applies everything from minor formatting changes to preparing a package update. The long-term goal is to have it handle kali-bleeding-edge, linking into AutoPkgTest.
Package Tracker – Tracks each package’s version history.
Packaging CI Overview (Git repo) – Quick (and dirty) overview of our packages’ CI status.
Upstream-Watch (Git repo) – Monitors when there is an update upstream.

Kali Autopilot

With the release of Kali Purple in Kali 2023.1, we also had the debut of Kali Autopilot. Since then, it has been worked on and is unrecognizable, with a redesigned GUI and a multitude of new features.

What is Kali Autopilot? We are glad you asked! Kali Autopilot is an automated attack framework. It is a bit like an “AutoPwner” that follows pre-defined “attack scenarios”. Development originally started for the defensive side of Kali. It is a lot easier to demonstrate Kali’s offensive side, especially when you start seeing the shells popping up. But when it comes to the defensive side, how do you know if you have set things up correctly? You start to ask questions:

Are the Intrusion Detection System (IDS) and the Web Application Firewall (WAF) detecting malicious activities?
Is the Security Information and Event Management (SIEM) system ingesting the right logs?
Are the dashboards and alerts tuned to detect attacks?
Are the analysts trained in finding the needle in the haystack?
Has it been tested? How can you test?

Either you can wait for someone to try and break in, or you could do it yourself. This is where Kali Autopilot comes in. Kali Autopilot consists of a GUI tool to design attacks and to generate attack scripts that perform those attack sequences, either manually or as a service, together with a web API interface for remote control.

You can also download example attack scripts from the Kali Purple Hub. We currently have scripts for juice-shop and DVWA. Just download the JSON from the hub and import it into Kali Autopilot. This tool has come along a lot in the last 6 months, and there are no plans on slowing down. As always, it is shaped by the community; ideas, features, and direction can be submitted and shaped by YOU.
If you have developed attack scripts for vulnerable machines, we would love to include them on our Kali Purple Hub.

New Tools in Kali

We will kick it off with what’s new (to the network repositories):

Calico – Cloud-native networking and network security
cri-tools – CLI and validation tools for the Kubelet Container Runtime Interface
Hubble – Network, service & security observability for Kubernetes using eBPF
ImHex – A hex editor for reverse engineers, programmers, and people who value their retinas when working at 3 AM
kustomize – Customization of Kubernetes YAML configurations
Rekono – Automation platform that combines different hacking tools to complete pentesting processes
rz-ghidra – Deep Ghidra decompiler and Sleigh disassembler integration for rizin
unblob – Extract files from any kind of container format
Villain – C2 framework that can handle multiple reverse shells, enhance their functionality, and share them among instances

We also bumped the Kali kernel to 6.3.7. Along with the new tools, there have been numerous package and library updates, both major and minor versions, such as: Greenbone, Humble, Impacket, jSQL, OWASP ZAP, Rizin, Tetragon, theHarvester, Wireshark, and many, many more.

Unfortunately, we had to drop a few packages from Kali:

king-phisher – The tool is no longer maintained by the original author. As an alternative, check out GoPhish.
plecost – The tool does not work with Python 3.11, and there has been no response from the original author. As a replacement, try WPScan.

We get a large amount of requests to add tools to Kali. We have a policy on which tools are added to Kali and a process for how tools are packaged up and added (from the network repositories to the default installed toolset). The drawback is that we do not have enough human power to process them all. Our solution has been to help tool authors and/or anyone from the Kali community create packages themselves, by writing a series of detailed, step-by-step guides covering the complete process and workflow of how we built those packages.

As an example of how this can work: a tool was originally submitted by its author, we reviewed it, liked it, and agreed it should be in Kali. We did not have the cycles to process it ourselves quickly enough, but the tool author did. They stepped up and re-submitted it with their tool packaged up. This saved us a lot of leg work, so reviewing the package became a breeze, and shortly after it was added to Kali. If you want a tool added to Kali, and you would like it to happen sooner than we can manage, have a go at packaging it yourself!
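As a rough illustration of what packaging involves, here is the kind of skeleton a Debian-style source package starts from. The package name, maintainer, and field values below are made-up placeholders for a hypothetical tool, not a real Kali package; the linked guides cover the real workflow.

```shell
# Create the minimal debian/ metadata for a hypothetical tool "mytool".
# All field values here are illustrative placeholders only.
mkdir -p mytool-1.0/debian
cat > mytool-1.0/debian/control <<'EOF'
Source: mytool
Section: utils
Priority: optional
Maintainer: Jane Doe <jane@example.com>
Build-Depends: debhelper-compat (= 13)
Standards-Version: 4.6.2

Package: mytool
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: example tool placeholder
 Longer description of the hypothetical tool.
EOF
ls mytool-1.0/debian
```

From there, debian/rules, debian/changelog, and the rest of the packaging follow the guides linked above.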
There are other resources on “Debian packaging” out there, as well as our linked guides above. There is an initial learning curve, but it is not as complex as you may think (especially if you are comfortable using Linux). Please note, we compile packages from source; submitting a binary *.deb file will not be accepted.

Miscellaneous

Below are a few other things which have been updated in Kali, which we are calling out but which do not have as much detail:

Added PipeWire support when using Hyper-V in enhanced session mode
Added kali-hidpi-mode to support Kali Purple
Improved installation of Kali Purple by removing the need to run any commands after installing kali-themes-purple
Kali Purple has a purple menu icon!
The final reminder about the breaking change with Python 3.12 & PIP

Kali NetHunter Updates

We are proud to introduce a redesigned Kali NetHunter app and a completely new NetHunter Terminal, thanks to the amazing work of our very own @martin and @yesimxev. On the Kali NetHunter kernel side, there are numerous updates:

LG V20 for Lineage 19.1
Nexus 6P for Android 8.0 (Oreo)
Nothing Phone (1) for Android 12 (Snow Cone) and 13 (Tiramisu) (new)
Pixel 3/XL for Android 13 (Tiramisu)
Samsung Galaxy A7 for LineageOS 18.1 (new)
Xiaomi Mi A3 for Lineage 20
Xiaomi Redmi 4/4X for VoltageOS 2.5

Also worth mentioning:

By popular demand, we have added a SELinux disabler.
Please note that until we are able to replace Mana Toolkit, we have had to temporarily downgrade iptables.

Kali ARM Updates

The Raspberry Pi Zero W image now boots to the CLI and not the GUI. This change is in line with what we did with the Raspberry Pi 1 image a few releases ago. If you do not create a wpa_supplicant.conf, the easiest way to connect to a Wi-Fi network on the command line is to use the nmtui command.
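If you do want to pre-seed Wi-Fi credentials with a wpa_supplicant.conf instead, a minimal file looks roughly like this. The SSID and passphrase are placeholders, and where the file should be placed depends on the image, so check the Kali ARM documentation for your device:

```shell
# Write a minimal wpa_supplicant.conf with placeholder credentials.
# Replace the ssid/psk values with your own network details.
cat > wpa_supplicant.conf <<'EOF'
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="your-network-ssid"
    psk="your-passphrase"
}
EOF
```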
Alternatively, you can use sudo nmcli --ask dev wifi connect network-ssid to have it ask you for the password on the command line, without it showing up in your history.

USBArmory MKI and MKII have had their bootloaders updated to 2023.07. The ARM build scripts have had some minor tweaks to deal with policykit updates, making sure the pkla files are properly created.

Kali Website Updates

Our Kali documentation has had various updates to existing pages as well as new pages. A website is never complete, and our homepage is no exception. Recently we have been making some improvements:

Get Kali – Should be a little easier to scroll and move about the page now, switching between platforms
Partnerships – Updated to say a thank you to the new partnerships!

Since our last release, we also published a number of blog posts. And a thank you to the people from the public who have helped Kali and the team for the last release; we want to praise them for their work (we like to give credit where due!). Anyone can help out, anyone can get involved!

New Kali Mirrors

We have another community mirror. If you have the disk space and bandwidth, we always welcome new mirrors.

Kali Team Discord Chat

Since the launch of our Discord server with Kali 2022.3, we have been doing an hour-long voice chat with a number of Kali team members, where anyone can ask us questions (hopefully relating to Kali or the information security industry). The next session will happen a week after the release, Wednesday, 30th August 2023, 16:00 -> 17:00 UTC. Please note we will not be recording this session; it is a live event only.

Get Kali Linux 2023.3

Fresh Images:
Simple: Get Kali! Did you know we also produce weekly builds that you can use? These are for people who cannot wait for our next release and want the latest packages (or bug fixes). This way you will have fewer updates to do. Just know that these are automated builds that we do not QA like we do our standard release images, but we gladly take bug reports about those images, because we want any issues to be fixed before our next release!

Existing Installs:
If you already have an existing Kali Linux installation, remember you can always do a quick update:

┌──(kali㉿kali)-[~]
└─$ echo "deb http://http.kali.org/kali kali-rolling main contrib non-free non-free-firmware" | sudo tee /etc/apt/sources.list
[…]

┌──(kali㉿kali)-[~]
└─$ sudo apt update && sudo apt -y full-upgrade
[…]

┌──(kali㉿kali)-[~]
└─$ cp -vrbi /etc/skel/. ~/
[…]

┌──(kali㉿kali)-[~]
└─$ [ -f /var/run/reboot-required ] && sudo reboot -f
You should now be on Kali Linux 2023.3. We can do a quick check by doing:

┌──(kali㉿kali)-[~]
└─$ grep VERSION /etc/os-release
VERSION="2023.3"
VERSION_ID="2023.3"
VERSION_CODENAME="kali-rolling"

┌──(kali㉿kali)-[~]
└─$ uname -v
#1 SMP PREEMPT_DYNAMIC Debian 6.3.7-1kali1 (2023-06-29)

┌──(kali㉿kali)-[~]
└─$ uname -r
6.3.0-kali1-amd64
NOTE: The output of uname -r may be different depending on the system architecture.

As always, should you come across any bugs in Kali, please submit a report on our bug tracker. We will never be able to fix what we do not know is broken! And social networks are not bug trackers!

Want to keep up-to-date more easily? Automate it! We have RSS feeds and a newsletter for our blog to help you.

How to Install React.js on Ubuntu 24.04


This tutorial will show you how to install React.js on Ubuntu 24.04 OS.
React.js is a free and open-source JavaScript library for building user interfaces based on components. It is written in JavaScript, and with it we can develop single-page and mobile applications, re-rendering only the specific parts of a page that have changed. In this blog post, we will also install NodeJS and NPM, which are required for a React.js application.
Installing React.js on Ubuntu 24.04 is straightforward and may take up to 10 minutes. Let’s get started!

Prerequisites to Install React.js on Ubuntu 24.04

Step 1. Update the system
Every fresh installation of Ubuntu 24.04 requires the packages to be updated to the latest versions available. To do that, execute the following command:
sudo apt update -y && sudo apt upgrade -y
Step 2. Install NodeJS
NodeJS is essential for every JavaScript application. To install it from the Ubuntu repositories, run the command below:
sudo apt install nodejs -y
After successful installation, you can check the NodeJS version by executing the command below:
node -v
You will get output similar to this:
root@host:~# node -v
v18.19.1

Step 3. Install NPM
Next, we will install the NPM package manager. To install it, execute the command below:
sudo apt install npm -y
After the installation, check the version:
npm -v
You should get output similar to this:
root@host:~# npm -v
9.2.0

Step 4. Install React.js
Before we install React.js, we need to install the package for creating React.js applications:
npm install -g create-react-app
After installation, check the installed version:
create-react-app --version
You should get the following output:
root@host:~# create-react-app --version
5.0.1
We can finally create the React.js project with the command below:
create-react-app rhtest
Once executed, the installation process will start:
root@host:~# create-react-app rhtest

Creating a new React app in /root/rhtest.

Installing packages. This might take a couple of minutes.
Installing react, react-dom, and react-scripts with cra-template…

You should allow some time for the installation to complete. There will be output similar to this:
Success! Created rhtest at /root/rhtest
Inside that directory, you can run several commands:

npm start
Starts the development server.

npm run build
Bundles the app into static files for production.

npm test
Starts the test runner.

npm run eject
Removes this tool and copies build dependencies, configuration files
and scripts into the app directory. If you do this, you can’t go back!

We suggest that you begin by typing:

cd rhtest
npm start

Happy hacking!
As you can see, there are commands to start the React.js server, but we will use another method in the next step.
Step 5. Create React.js service
Creating a systemd service file will help us to easily manage our React.js application with a couple of commands, which will be explained in the next paragraphs. Let’s first create the service file:
touch /lib/systemd/system/reactjs.service
Open the file with your favorite editor and paste the following lines of code:
[Unit]
Description=React.js development server
After=network.target

[Service]
Type=simple
User=root
Restart=on-failure
WorkingDirectory=/root/rhtest
ExecStart=/usr/bin/npm start -- --port=3000

[Install]
WantedBy=multi-user.target

Save the file, close it, and reload the daemon with the command below:
systemctl daemon-reload
Once done, start and enable the React.js service:
sudo systemctl start reactjs && sudo systemctl enable reactjs
To check the status of the React.js service:
sudo systemctl status reactjs
You should receive output similar to this:
root@host:~/rhtest# sudo systemctl status reactjs
● reactjs.service
Loaded: loaded (/usr/lib/systemd/system/reactjs.service; static)
Active: active (running) since Sun 2024-03-17 18:33:12 CDT; 3s ago
Main PID: 127933 (npm start --por)
Tasks: 37 (limit: 4624)
Memory: 108.0M (peak: 108.4M)
CPU: 3.862s
CGroup: /system.slice/reactjs.service
├─127933 "npm start --port=3000"
├─127949 sh -c "react-scripts start --port=3000"
├─127950 node /root/rhtest/node_modules/.bin/react-scripts start --port=3000
└─127961 /usr/bin/node /root/rhtest/node_modules/react-scripts/scripts/start.js --port=3000

Mar 17 18:33:12 host.test.vps systemd[1]: Started reactjs.service.
Mar 17 18:33:13 host.test.vps npm[127933]: > rhtest@0.1.0 start
Mar 17 18:33:13 host.test.vps npm[127933]: > react-scripts start --port=3000

Now, you can access your React.js app on port 3000 at the URL: http://YourServerIPAddress:3000.

Congratulations! You have successfully learned how to install React.js on Ubuntu 24.04.
Of course, you do not have to install it on your own. All you have to do is sign up for one of our NVMe VPS plans and submit a support ticket. Our admins are available 24/7 and will help you with any aspect of installing React.js.
If you liked this post on how to install React.js on Ubuntu 24.04, please share it with your friends on social networks or simply leave a reply below. Thanks.

Parsero — Scan for Vulnerability


The world of cybersecurity is really thrilling, where every click, tap, and byte counts. Today, we are going to learn the basics of a nifty tool called Parsero on our Kali Linux system.

Parsero is like a digital bloodhound, with a mission to sniff out vulnerabilities in websites. It is basically our cyber detective buddy, equipped with the skills to uncover hidden threats lurking in the depths. Now let’s get our hands dirty and dive into the action.

First of all, we need to have the Parsero tool on our system. Don’t worry, it comes pre-installed with the full version of Kali Linux; if not, we can simply install it by running the following command in our Kali Linux terminal:

sudo apt install parsero -y

It will prompt for our root password, and the tool will be installed within a few seconds.

Before using Parsero on our Kali Linux system, let’s check the options of this tool with the following command:

parsero -h

The above command shows the help for the Parsero tool, as we can see in the following screenshot.

Let’s run it against a target. Lord Google can serve as an example, just for scanning purposes; we are not really attacking the lord of the surface internet. We should never attack any website without proper, legal, written permission; we can create our own vulnerable site for that. So the command will be as follows:

parsero -u https://www.google.com

We can see the result of the above command in the following screenshot. Parsero is performing well and finding some directories.

Parsero is actually a Python script which reads the robots.txt of a website and looks at the Disallow entries. The Disallow entries tell the search engines which directories or files hosted on a web server must not be indexed. For example, “Disallow: /portal/login” means that the content at www.example.com/portal/login is not allowed to be indexed by crawlers like Google, Bing, Yahoo, etc. This is how administrators avoid sharing sensitive or private information with the search engines.
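To make the idea concrete, here is a quick sketch of the first half of what Parsero automates: pulling the Disallow paths out of a robots.txt. The robots.txt content below is a made-up sample, not Google’s real file:

```shell
# Extract the path from each Disallow entry of a sample robots.txt.
robots='User-agent: *
Disallow: /portal/login
Disallow: /admin
Allow: /public'

# Prints the two disallowed paths: /portal/login and /admin
printf '%s\n' "$robots" | awk '$1 == "Disallow:" {print $2}'
```

Parsero then requests each of these paths and records the HTTP status code; you could do the same by hand with something like curl -s -o /dev/null -w '%{http_code}' against each URL.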
But sometimes the paths listed in the Disallow entries are directly accessible by users without using a search engine, just by visiting the URL and the path, and sometimes they are not available to be visited by anybody. Because it is really common that administrators write a lot of Disallow entries, some of which are reachable and some of which are not, we can use Parsero to check the HTTP status code of each Disallow entry automatically, in order to see whether these directories are available or not.
Also, the fact that the administrator writes a robots.txt does not mean that the files or directories listed in the Disallow entries will not be indexed by Bing, Google, Yahoo, etc. For this reason, Parsero is capable of searching in Bing to locate content indexed without the web administrator’s authorization. Parsero will check the HTTP status code of each Bing result in the same way.

We can see there are a lot of red lines in Parsero’s results, which indicate the HTTP status codes:

200 OK – The request has succeeded.
403 Forbidden – The server understood the request, but is refusing to fulfill it.
404 Not Found – The server hasn’t found anything matching the Request-URI.
302 Found – The requested resource resides temporarily under a different URI (Uniform Resource Identifier).

If we want to see only the “HTTP 200” status codes, then we have to use the -o flag, like so:

parsero -o -u https://www.google.com

In the following screenshot we can see only the “HTTP 200” status codes.

If we have a list of domains to run Parsero against, we can note those websites down in a text file, one per line, just like in the following screenshot. If we have other targets, we can add them in the same way. Now we can scan the list with Parsero. We need to specify our list of websites, named ‘targets.txt’ and stored on our Desktop, and we also want to see “HTTP 200” status codes only. So our command will be the following:

parsero -o -f ~/Desktop/targets.txt

After running the above command, Parsero will start scanning the websites given in the list, as we can see in the following screenshot. Once Parsero completes its scan, it’ll spit out a detailed report
highlighting any potential vulnerabilities it found. We need to pay close attention to these findings, as they give us valuable insights into how secure (or not-so-secure) the website is.

And there we have it, folks! We’ve just dipped our toes into the world of cybersecurity with Parsero on Kali Linux. But remember, this is just the beginning. The cyber realm is vast and ever-evolving, so we need to stay curious, keep learning, and never underestimate the power of a good cyber tool in our arsenal. Happy hunting, and may the digital winds be ever in our favor!

Love our article? Make sure to follow us on Twitter and GitHub, where we post article updates. To join our KaliLinuxIn family, join our Telegram group and WhatsApp channel; we are trying to build a community for Linux and cybersecurity. We are always happy to help everyone in the comment section, which is open to all. We read each and every comment, and we always reply.