MATLAB Is How Engineers and Pioneers Really Do Data Science. Here’s How To Learn It


Look, we’re not going to try to oversell this. If you’re just a casual dabbler, or only a semi-curious rubbernecker of true scientific tech innovation, then you might just want to steer clear of MATLAB.

It’s not that this math-based programming language used by engineers and scientists is incredibly hard to get a handle on. It’s actually more approachable for newcomers than you’d likely think. Yet scientific programming (apps built around data analysis, modeling, simulation, and visualization) often deals with enormously vast data sets and with scientific problems that have far more questions than answers.

At those levels, learning MATLAB is not like learning Python or Ruby on Rails. Yet with the coursework found in The Complete MATLAB Programming Certification Bundle, budding engineers, programmers, and scientists not only get a feel for this highly specialized environment, but they also get a true taste of how today’s top data science questions and puzzles are actually being solved.

Over this collection of seven courses, students get hands-on instruction in a primary data analysis tool that’s been aiding hardcore scientific exploration since the 1980s.

Complete MATLAB Programming for Beginners kick-starts the training, a strong introduction for first-time users to the software’s key concepts and abilities. Before too long, students will try their hand at building algorithms, generating 2D and 3D graphs, and even assembling their own animations, along the lines of the sketch below.
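For a taste of what that kind of first exercise looks like, here’s a minimal MATLAB sketch (our own illustration, not material from the bundle’s actual coursework) that draws a basic 2D graph and then animates it:

```matlab
% Minimal beginner exercise: plot a 2D curve, then animate it.
% (Illustrative only -- not taken from the bundle's coursework.)
x = linspace(0, 2*pi, 200);          % 200 evenly spaced sample points
h = plot(x, sin(x), 'LineWidth', 2); % basic 2D graph; keep the handle
xlabel('x'); ylabel('sin(x)'); title('Hello, MATLAB');

% Simple animation: sweep the phase of the sine wave frame by frame
for phase = linspace(0, 2*pi, 60)
    set(h, 'YData', sin(x + phase)); % update the curve in place
    drawnow;                         % render this frame
end
```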

The extent of MATLAB’s modeling and simulation capabilities is explored in MATLAB/SIMULINK Masterclass: From a Beginner to an Expert through real-world examples. Then, Data Preprocessing for Machine Learning Using MATLAB and Machine Learning for Data Science using MATLAB show users how to start applying MATLAB’s talents to one of the most influential fields in tech today: artificial intelligence and machine learning.

And if your interest in MATLAB is more grounded in practical concerns, you’ll also find three courses covering how MATLAB can manage electrical power systems, from standard household current all the way up to city-wide power stations and even solar energy.

Right now, you can get this full-scale MATLAB indoctrination, a $3,000 value, for a fraction of that price, just $34.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.


Nvidia RTX 3090 Review Roundup: Absolute Creative Power, Maximum Chonk


Reviews of the RTX 3090 make it clear this GPU is a lot of things. It’s about the weight of a small roasted chicken (RTX 3090: 4.84 pounds; roasted chicken: 5-7 pounds). It’s longer than the Xbox Series X is tall. It’s a true triple-slot GPU and packs 24GB of VRAM. If you choose to wield an RTX 3090 as an offhand weapon, you will suffer a -4 / -8 to attacks due to its off-balance weight and heft. This may be reduced to 0 / -4 if you took graphics cards as an exotic weapon proficiency.

Beyond these metrics, the RTX 3090 is more. It’s a heck of a lot more expensive than the just-launched RTX 3080, and it offers about 1.15x the performance of that card. Nvidia has used this launch to push the idea that the RTX 3090 is an 8K gaming card or a GPU for ultra-high-end prosumers. Whether that’s true is a little more complicated. 8K gaming isn’t in its infancy. 8K gaming is still in the womb.

We’ve rounded up coverage from multiple publications, including Eurogamer, Hot Hardware, and PC Gamer. The general opinion on the RTX 3090 is that the 24GB of VRAM can be genuinely useful in certain 4K+ content creation workloads. It could also be useful if you are working extensively with AI modeling, depending on the needs of your model. As far as its actual usefulness in gaming goes, however, 24GB is overkill.

All Top-End GPUs Are Bad Deals, but Some Are Worse Than Others

In order to talk about whether the RTX 3090 is a good deal, we have to sandbox the problem a bit. Objectively speaking, the one thing the RTX 3090 offers that the RTX 3080 cannot match is the additional 14GB of VRAM. For certain prosumers, this alone can justify the purchase, especially if you want one GPU for both your 4K+ video editing and your personal gaming without compromising on features or performance for either.

For everyone else, this GPU is an objectively bad deal. The RTX 3080 is (or will be, once you can buy one) half the price and offers about 85 percent of the performance. Given how good Ampere’s performance is to start with, the RTX 3080 is a great pick for anyone who wants to game at the highest frame rates and resolutions but still has a gasp of concern for affordability and the price/performance ratio. The RTX 3090, not so much.
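To put rough numbers on that, here’s a back-of-the-envelope value comparison. The prices are assumptions on our part (roughly $700 for an RTX 3080 against $1,500 for the RTX 3090), and actual street prices will vary:

```matlab
% Rough performance-per-dollar comparison (assumed prices, see above)
price_3090 = 1500;  perf_3090 = 1.00;  % RTX 3090 as the baseline
price_3080 = 700;   perf_3080 = 0.85;  % ~85% of the 3090's performance

value_ratio = (perf_3080 / price_3080) / (perf_3090 / price_3090)
% ~1.8, i.e. the RTX 3080 delivers roughly 1.8x the performance per dollar
```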

But.

The RTX 3090’s overall feature set is a much better match for its price point than the RTX 2080 Ti’s was. The 8K gaming claim is a little wobbly, but that’s partly because standing up 8K gaming is itself a little wobbly right now. Nvidia created an 8K DLSS mode (upscaling 1440p to 8K), but because basically nobody has an 8K display, it’s also possible to use dynamic super resolution (DSR) to push the rendering resolution up that high via supersampling on a lower-resolution panel. The reviewers that tested the GPU at 8K all appear to have used this method.

Image by Hot Hardware

Even among the relative handful of customers who buy GPUs like this in the first place, 8K gaming is going to be the nichest of niche applications. Most users will encounter this card at 5K or below, and it excels at those resolutions, even if they don’t show off its advantages over the RTX 3080 to quite the same extent.

The RTX 3090 is not a good value for most customers. But evaluated against previous top-end cards, I’d say it’s the best value since the GTX 1080 Ti. The general opinion of the reviewers that tested it agrees:

Eurogamer writes: “All told then, RTX 3090 is the traditional hard sell for the mainstream gamer but the high-end crowd will likely lap it up.”

Hot Hardware’s conclusion echoes the points above. It notes that the GPU is only 4-20 percent faster than the RTX 3080 for regular gamers before noting: “Consider complex creator workloads which can leverage the GeForce RTX 3090’s additional resources and memory, however, and it is simply in another class altogether and can be many times faster than either the RTX 3080 or Titan RTX.”

Image by PC Gamer

PC Gamer also emphasizes the content creator aspect, stating: “This is the Ampere generation’s Titan. That’s how you justify selling a GeForce GPU, with only 11 percent higher 4K gaming performance over its closest sibling, with a 114 percent higher sticker price.”

In its conclusion, PCG leans harder into the “not for gamers” angle than the other reviews, saying:

This is a toughie because there are very, very few people who I would recommend the $1,500 RTX 3090 to. And none of them are gamers…This is every inch the Titan card Jen-Hsun said it would be. It’s a creator’s card, one with a stunningly powerful GPU and a frame buffer that allows personal creation on a level not seen on a GPU this side of $6,000.

In short, this GPU may be the best thing that ever happened to you, if you’re a creative type with $1,500 to drop on a high-end card with dramatically better performance. The rest of us can more than make do with the excellently positioned RTX 3080.

Your move, AMD.


ET Deals: Dell New G5 Gaming Desktop for $599, Samsung Galaxy Note 20 5G for $799


If you’re looking for a PC to run the latest games on, then you should consider Dell’s new G5 gaming desktop. This system has an Intel Core i5 processor along with an Nvidia GTX 1650 Super GPU that gives it sufficient power to run most games at 1080p with ease. The system’s also on sale and can be picked up for just $599.99.

Dell G5 Intel Core i5-10400F Gaming Desktop w/ Nvidia GeForce GTX 1650 Super GPU, 8GB DDR4 RAM and 1TB 7,200RPM HDD ($599.99)

Dell built this gaming desktop with an Intel Core i5-10400F and a GeForce GTX 1650 Super graphics processor. Together, this hardware can run games with high settings at 1080p resolution. The system also has a unique front panel that looks cool and edgy, and with promo code G5DTAFF1 you can get it now marked down from $799.99 to just $599.99 from Dell.

Samsung Galaxy Note 20 5G 128GB 6.7-Inch Unlocked Smartphone ($799.99)

Coming equipped with Qualcomm’s Snapdragon 865+ SoC, the Galaxy Note 20 should be exceptionally fast, with performance on par with today’s latest flagship phones. The phone also utilizes a 6.7-inch Dynamic AMOLED 2X Infinity-O display with a 120Hz refresh rate. It also has 128GB of storage along with 8GB of RAM and a 4,300mAh battery. For a limited time you can get this cutting-edge smartphone from Amazon marked down from $999.99 to just $799.99.

Asus VivoBook 15 F512JA Intel Core i5-1035G1 15.6-Inch 1080p Laptop w/ 8GB DDR4 RAM and 512GB M.2 NVMe SSD ($599.99)

This computer comes with an Intel Core i5-1035G1 quad-core processor that operates at speeds of up to 3.6GHz. The notebook is also relatively light at 3.75 pounds and quite thin at 19.9mm. Asus also tossed in a 1080p display and a built-in fingerprint scanner, making this system a well-rounded solution for web browsing and simple office work. Right now you can get it from Amazon for $599.99.

Seagate Barracuda STJM2000400 2TB External USB-C SSD ($279.99)

Built using NAND flash, this external drive can transfer data at rates of up to 540MB/s. It can also hold up to 2TB of data, which makes it an excellent option for anyone who needs a large amount of fast portable storage. Currently you can get one from Amazon marked down from $349.99 to just $279.99.

Dell Vostro 3671 Intel Core i5-9400 Desktop w/ 8GB DDR4 RAM and 256GB SSD ($499.00)

This desktop comes equipped with a six-core processor that operates at up to 4.1GHz and offers solid performance for everyday tasks. This model also has 8GB of RAM and a 256GB SSD. Right now it’s marked down from $998.57 to $499.00 from Dell.

Logitech M330 Silent Plus Wireless Mouse ($14.99)

The M330 mouse from Logitech was built to be an affordable wireless mouse with solid performance. The mouse reportedly produces 90 percent less noise when clicked than a standard mouse, and it can also last for up to two years on a single battery. Amazon is offering these mice at the moment marked down from $29.99 to just $14.99.

Note: Terms and conditions apply. See the relevant retail sites for more information. For more great deals, go to our partners at TechBargains.com.


Scientists Sequence Genome of Mold That Gave Us Penicillin, the First Antibiotic

Credit: CABI

The discovery of antibiotics by Scottish scientist Alexander Fleming is one of humanity’s greatest achievements. Suddenly, diseases that had plagued humanity for generations were treatable with a few injections. And it’s all thanks to a strain of Penicillium fungus. Now, a team from Imperial College London and Oxford University has revived the mold to sequence its genome for the first time.

Fleming did not set out to change medicine in 1928, but he did notice that the Penicillium fungus growing in one of his petri dishes had effectively killed the cultured staphylococcus bacteria. It is from this mold that Fleming isolated penicillin. Molds like Penicillium rubens produce antibiotic compounds naturally. In the case of penicillin, the molecule contains a β-lactam ring structure that interferes with the ability of bacteria to make new cell wall segments. Without a cell wall, the bacterium dies quickly under most circumstances.

It would not be an exaggeration to say that penicillin changed the world. It won Fleming a Nobel Prize in 1945. Recognizing the importance of the original strain (above), scientists cryogenically preserved it for future study. The UK team was interested in performing some experiments on that original strain but then realized no one had ever sequenced the genes of Fleming’s Penicillium. So that’s what they did.

The team compared the Fleming strain to two commercial strains of Penicillium mold developed in the US. The US strains were based on a wild strain found on a cantaloupe, but they were also the subject of early attempts at genetic manipulation and artificial selection. Scientists in the 20th century bombarded the fungus with X-rays and carefully cultivated the spores that produced the highest levels of penicillin.

Penicillium growing on an orange.

Mutating the genome boosted penicillin production, but the changes were not as dramatic as expected. The team looked at the genes for penicillin production and the genes that regulate that production. Interestingly, the regulatory system seems identical in Fleming’s mold and the mutated US strains. However, the US strains have more copies of the genes that produce the molecule. The team also found some differences in the encoding genes, which suggests natural evolutionary changes based on the bacteria the mold encounters in its natural environment.

By learning more about how Penicillium rubens changed to serve humans, we might discover new methods of cultivation and optimization. The team has made the study open access on the Nature website so anyone can check out the sequence.


Nuvia Raises $240M for CPU Development, Releases New Details


We’ve been following some of the smaller CPU vendors, like Nuvia and Ampere, that have emerged as potential challengers to x86 in the hyperscale server industry. This time around, Nuvia is in the news — I’ll spare you the pun — for raising a massive $240M in funding as it seeks investor support to challenge companies like Intel and AMD.

Funding rounds aren’t the sort of thing we cover much at ET, but I had the opportunity to chat with John Bruno, SVP of Nuvia and a former SoC and chip developer with Google, Apple, AMD, and ATI. One question that’s been on my mind since Nuvia announced that its Phoenix CPU is much faster than Zen 2 while using less power is this: Is the company betting its performance on the ARM architecture, specifically?

This isn’t just an incidental point. Talk to an x86 engineer — from either Intel or AMD — and they’ll tell you that the decode penalty x86 pays for turning CISC into RISC inside the core is tiny these days. On the ARM side of things, there’s a myth that floats around claiming that ARM chips can beat x86 because of some supposed inefficiency between CISC and RISC designs. It’s an argument that’s literally more than 25 years out of date, unless we’re talking about the performance of Intel’s Medfield versus a Cortex-A9. The Bonnell and Saltwell-core Atoms (OG 45nm and its 32nm die shrink) are the only chips that decode native x86 that aren’t old enough to vote.

According to Bruno, the idea that the CPU’s high performance requires the ARM ISA isn’t entirely true, though Nuvia’s first-generation chip is implemented under one of ARM’s custom architecture licenses. In his words, the core’s expected performance is the result of “micro-architecture, architecture, and implementation.”

Note: We typically use “microarchitecture” and “architecture” practically synonymously, but they aren’t synonyms. In this context, “architecture” refers to the instruction set architecture (ARMv8.x or ARMv9). “Microarchitecture,” then, covers the specifics of how a semiconductor company executes an ISA within the CPU. “Implementation,” in the example above, refers to process node and foundry tech — basically, the improvements and advantages Nuvia expects its foundry partners to deliver on their side of the equation.

Nuvia’s goal is to deliver a CPU that can challenge companies like Intel and AMD — as well as Ampere, Graviton, and some of the other ARM players — across all fronts. This isn’t as banal as it sounds. We’ve seen companies adopt a variety of strategies in an effort to differentiate their products, including a number that emphasize high core counts as opposed to per-thread scaling. Nuvia’s claim, as stated in a blog post from August, is that its upcoming Phoenix core “performs up to 2X faster than the competition” when compared within a 1W – 4.5W envelope.

There are a few ways to read this. First, Nuvia is picking a data point that favors its own designs: x86 CPUs, generally speaking, don’t run that low. AMD’s 3990X squeezes down to about 3W per core at 3GHz. This is where Nuvia thinks it can offer still-higher efficiency.

The 3990X is itself an example of how potent these gains can be. If AMD could deliver 3GHz in 2W per core instead of 3W, it would save a watt on each of the chip’s 64 cores, knocking 64W off its 280W TDP target, or it could use that headroom to bring clock speeds up per core. Nuvia thinks it can hit these improvements even after accounting for the expected gains from CPUs like AMD’s upcoming Zen 3 architecture. AMD, if you ask (we asked), will tell you that it isn’t particularly worried.
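The per-core math is simple enough to sketch out (a back-of-the-envelope illustration using the figures above, not a power model of the actual silicon):

```matlab
% Back-of-the-envelope power math for a 64-core part (illustrative only)
cores      = 64;    % Threadripper 3990X core count
w_per_core = 3;     % ~3W per core at 3GHz today
w_target   = 2;     % hypothetical Nuvia-class efficiency target
tdp        = 280;   % 3990X TDP in watts

savings = cores * (w_per_core - w_target)  % = 64W saved
new_tdp = tdp - savings                    % = 216W at the same clocks
```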

In an article earlier this week, I argued that we’re on the cusp of the most exciting CPU market in 30+ years. Ironically, I forgot to mention Graviton, Amazon’s server CPU play, but it’s another ARM design to watch. Some readers have floated the idea that the x86 server market is simply too big, too optimized, for other companies to dent it. The list of companies that used to think this way is long, storied, and mostly dead. Of the various RISC vendors that believed their vertically integrated semi-monopolies were safe from Intel to one degree or another, the only one left standing is IBM, with a hardware business that’s a shadow of the titan it once was.

x86 has all of the powerful advantages of incumbency. It has the weight of familiarity, the robust ecosystem only a few decades of being “the standard” can bring you, and the attention of a large group of engineers from multiple companies, all of whom are dedicated to improving its performance. x86 is formidable, in ways people who mock the architecture seldom like to admit.

But formidable and “invulnerable” are not synonyms. The advent of AI and ML accelerators has at least temporarily cracked open a sclerotic market. Big things are afoot in the space, and while it’s going to be a few years before we see major changes, we’ll all be the beneficiaries of the renewed competitive focus in the CPU market long term.


Famous Black Hole Shows Its Wobbly Past in New Movie

The M87 supermassive black hole, imaged in 2019.

The release of the first-ever image of a real black hole in 2019 was a watershed moment for science, but there’s still more work to do. The Event Horizon Telescope (EHT) team is still planning future observations, but it’s also looking at old data to strengthen our understanding of how black holes work. The fruit of that labor is a short movie showing the evolution of the now-famous black hole over the past decade.

Black holes are the collapsed remains of massive stars, and they were predicted by Einstein’s general relativity long before we ever found evidence of them. For decades, however, we could only infer the presence of black holes from X-ray emissions and gravitational effects. Imaging the supermassive black hole at the center of the M87 galaxy in 2019 (based on data from 2017) was an incredible accomplishment and yet more confirmation of general relativity.

The EHT team intends to conduct more observations of M87 and the central black hole of our own galaxy yearly in March or April. That’s when conditions are likely to be best for the network’s numerous telescopes around the world. However, the project was put on hold this year due to the COVID-19 pandemic. Instead, the team dug through old data on M87 to create images of its evolution over the past decade.

The new animation of the black hole’s wobbly past comes from the old data, plus the mathematical model developed for the famous 2019 image. The result involves a little more guesswork than the last one — the earlier data didn’t have enough resolution for imaging the black hole, but it was consistent with the data acquired in 2017. So, it was possible to plug it into the existing model to get an idea of how M87 has changed over time. 

This animated GIF (above) shares several important features with the 2019 still. As expected, the central zone is dark because that’s where the event horizon is — anything that crosses that boundary is lost forever to the singularity’s crushing gravity. Around that is a bright ring known as the accretion disk, where matter heats up as it spirals inward. One side of the ring is brighter than the other because the bright side rotates toward us and the other rotates away.

Interestingly, the animation shows that bright section moving around quite a lot over the past decade. This could be due to small changes in the disk’s rotation that reinforce or cancel out the brighter regions. This might be normal for a black hole of this size, but we won’t know until the team has a chance to conduct more observations in 2021 and beyond.
