Best products from r/ComputerEngineering

We found 35 comments on r/ComputerEngineering discussing the most recommended products. We ran sentiment analysis on each of these comments to determine how redditors feel about different products. We found 26 products and ranked them based on the number of positive reactions they received. Here are the top 20.

19. USB C to Ethernet Adapter, uni RJ45 to USB C Thunderbolt 3/Type-C Gigabit Ethernet LAN Network Adapter, Compatible for MacBook Pro 2020/2019/2018/2017, MacBook Air, Dell XPS and More - Gray

    Features:
  • 【LAN to USB-C Adapter】Obtain stable connection speeds up to 1Gbps; downward compatible with 100Mbps/10Mbps networks. Our Type-C to LAN Gigabit Ethernet (RJ45) Network Adapter supports large downloads at maximum speeds without interruption. (To reach 1Gbps, make sure to use CAT6 & up Ethernet cables.)
  • 【Reliable & Enduring Connectivity】Designed specifically for plug-and-play connection between USB-C devices and wired networks; provides gigabit Ethernet connectivity even when wireless connectivity is inconsistent or overextended.
  • 【Thoughtful Design】Compact and lightweight, with a user-friendly non-slip design for easier plugging and unplugging. Braided nylon cable for extra durability. Premium aluminum casing for better heat dissipation. High-quality USB-C connector provides snug connection with your devices for stable signal transfer.
  • 【Wide Compatibility】Compatible with MacBook Pro 16''/15” (2020/2019/2018/2017), MacBook (2019/2018/2017), MacBook Air 13” (2018), iPad Pro (2020/2018); XPS 13/15; Surface Book 2; Google Pixelbook, Chromebook, Pixel, Pixel 2; Asus ZenBook. Compatible with Samsung S20/S10/S9/S8/S8+, Note 8/9, Galaxy Tablet Tab A 10.5, and many other USB-C laptops, tablets, and smartphones.

Top comments mentioning products on r/ComputerEngineering:

u/gineton2 · 3 pointsr/ComputerEngineering

I'm about your age and taking CS and Engineering courses. The only way to find out is by trying. One quick and inexpensive way of getting your feet wet is by learning some basic coding online (for example, Harvard's CS50) and doing the first few electronics projects with an Arduino kit (like this one).

Then, I recommend doing lower-division prerequisites at community college to get a taste of the engineering curriculum. Specifically, you should take at least one programming class, Calculus 1, Physics 1, and an engineering class or two (hopefully one that is project-based or has a hands-on component). With these, you should have a good introduction to the different directions you can take.

You should also think about why you want to take Computer Engineering. CpE is a good major, but if your interest/goal is to work in software, Computer Science will be a more flexible major and usually have fewer requirements (read: you can graduate sooner). Don't get into the major just because of the engineer moniker, get into CpE because you want to have more flexibility in your career in working with hardware, software, and electronics. CS will give you a better background for a broad career as a software engineer, with more elective options. There are something like 10x more software jobs than hardware jobs. You can work in software with CpE, but the major itself can be pretty focused on electrical engineering, circuits, and hardware. This depends on your school.

Once you've taken the classes I suggested, you should be able to better decide what you're most interested in and how much math and physics you want to take. If you don't mind taking more math and physics fundamentals, then CpE can be a good fit for you. Depending on how you feel about your programming courses and hands-on engineering courses, you will have more clarity on whether you want to have more focus in hardware, software, or neither.

u/Statici · 1 pointr/ComputerEngineering

Hmm, maybe. It's hard to say exactly what I want right now. My first introduction to actual computer science was through this book, which is what ultimately led me to studying theoretical physics (yeah I know, a weird pathway lol). I'm fascinated with the optimization of algorithms, and a friend recently recommended this book to me because my final project is building a 2's complement adder/subtractor and I thought about going further. It includes this paragraph:

>The downside of potentially rising processor performance is an unprecedented increase in hardware and software complexity. The trend toward greater complexity is not only at odds with testability and certifiability but also hampers adaptability, performance tuning, and evaluation of the various trade-offs, all of which contribute to soaring development costs. A key challenge facing current and future computer designers is to reverse this trend by removing layer after layer of complexity, opting instead for clean, robust, and easily certifiable designs, while continuing to try to devise novel methods for gaining performance and ease-of-use benefits from simpler circuits that can be readily adapted to application requirements.

That's sort of...exactly what I want to do. Whether that is computer engineering, computer science, or both, idk. Where that might happen I don't know either - and it's not as if a recent speedup to carry-lookahead adders is going to make the news (but it would still interest me). Do you happen to know if there are any schools out there that do research focused specifically on that sort of stuff?
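
The adder/subtractor project mentioned above can be modeled in a few lines of Python. This is an illustrative sketch (the `WIDTH` constant and function name are my own, not from any book): for subtraction, each bit of B is inverted and the initial carry-in is set to 1, which computes A + (~B + 1) = A − B in two's complement, exactly as the hardware does.

```python
WIDTH = 8  # word size for this sketch

def add_sub(a, b, subtract=False):
    """Return (result, carry_out) of a +/- b on WIDTH-bit two's complement words."""
    carry = 1 if subtract else 0   # carry-in of 1 supplies the "+1" for subtraction
    result = 0
    for i in range(WIDTH):
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        if subtract:
            bit_b ^= 1             # invert B (one's complement) for subtraction
        s = bit_a ^ bit_b ^ carry                            # full-adder sum bit
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))  # full-adder carry-out
        result |= s << i
    return result, carry

# 100 - 36 = 64; 5 - 9 wraps to 252, i.e. -4 in 8-bit two's complement
print(add_sub(100, 36, subtract=True)[0])  # 64
print(add_sub(5, 9, subtract=True)[0])     # 252
```

A carry-lookahead adder computes the same carries, just without the bit-serial dependency chain this loop has.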

u/brucehoult · 1 pointr/ComputerEngineering

Welcome!

You need two books:

https://www.amazon.com/Computer-Organization-Design-RISC-V-Architecture/dp/0128122757

Get the original MIPS or later ARM version if you prefer -- they're absolutely fine, and the principles you learn in one apply to everything -- but the RISC-V one is the newest and is the only one you're actually legally allowed to make an implementation of at home and distribute, put on GitHub, etc.

But of course designing and making your own 16 bit ISA is huge fun, so I definitely recommend that too!

Once you've digested all that, their other book is more advanced. But the first one will get you a long way. This next one is the absolute bible of real computer architects and hardware designers.

https://www.amazon.com/Computer-Architecture-Quantitative-Approach-Kaufmann/dp/0128119055

That's by these guys, who originally invented the RISC-I and MIPS processors in the early 80s, invented the term "RISC" (and also RAID, btw). They recently received the Turing award for their lifetime efforts:

https://www.youtube.com/watch?v=3LVeEjsn8Ts

Join comp.arch on usenet / google groups. There are lots of actual working or retired computer architects there, and they're helpful to energetic students and amateurs designing their own toy stuff.

u/FUHGETTABOUTIT_1 · 0 pointsr/ComputerEngineering

I think a great way to get started in programming is HackerRank.
All the recommendations I’ve read here are very good so far. Also, look into
the basics of networking protocols (SSL, TCP, etc.). Have some basic
knowledge of how the internet works and how ISPs come into the picture.
Do not use an IDE while taking programming classes!
Stick with a simple text editor like Vim (my everyday tool).
I’ve quizzed friends who only use IDEs, and it’s embarrassing when they
can’t even create a simple hello world program from scratch; they’re so used to having
the IDE create a template for them. Even if the instructors use IDEs, avoid them.
Understand how VMs work (and Docker if you have some time). Get comfortable debugging
your programs when they don’t work. Use a debugger, avoid print statements for now.
Trust me, you’ll be doing a ton of debugging when you’re out in the field.
Get comfortable on the command line. At least learn the very basics of
Linux/bash commands (ls, cd, mv, cp, etc.).

Read this book (It gives you great tips on how to learn and retain knowledge):
Make It Stick: The Science of Successful Learning
https://www.amazon.com/dp/0674729013/ref=cm_sw_r_other_apa_1x21AbKZZ3XGE

Don’t do drugs! Stay in school and keep on learning!

Best of luck!

u/csp256 · 2 pointsr/ComputerEngineering

Your university will let you take as many classes as you want.

You aren't in university to complete a checklist and move on. Take all the classes which will give you marketable skills.

PDE is pretty hit or miss. Almost every differential equation is analytically unsolvable. In most undergraduate ODE and PDE courses you will be wasting your time practicing how to solve one of a very small number of special cases by hand, without learning the deeper theory (because frankly, you're not ready to).

Yes, really; the way DE is taught almost everywhere is useless and "done for historical reasons". I think they do it just because it makes their job (turning freshmen into grad students) easier by letting you practice calculus for a while.

At best you learn how to use integral transforms (the Laplace transform, maybe with a mention of the Fourier transform) and maybe some connection to linear algebra (it might be the first time you see linearity used outside of a 'straight line' context).

In reality you will almost always need numerical techniques when it comes to differential equations. Those come in all shapes and sizes. Learn numerical methods. I like this book. There is a pdf out there.

For your general erudition it is important to understand the derivation of the heat equation in terms of a Fourier series. The heat equation is just a special case of the diffusion equation, and the diffusion equation is everywhere.
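
To make "numerical techniques" concrete, here is a small Python sketch solving the 1-D heat equation u_t = α·u_xx with an explicit finite-difference scheme (forward Euler in time, central differences in space). The grid size, diffusivity, and step counts are arbitrary demo choices, not taken from the comment:

```python
alpha = 1.0          # diffusivity (assumed)
n, dx = 21, 0.05     # 21 grid points on [0, 1]
dt = 0.4 * dx**2 / (2 * alpha)   # safely inside the stability bound dt <= dx^2/(2*alpha)

# initial condition: a hot spot in the middle, ends held at 0
u = [0.0] * n
u[n // 2] = 1.0

for _ in range(200):                     # march forward in time
    new = u[:]
    for i in range(1, n - 1):            # update interior points only
        new[i] = u[i] + alpha * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
    u = new                              # boundary values stay fixed at 0

# the spike diffuses outward: the profile flattens and heat leaks out the ends
print(max(u))
```

The same three-point stencil idea generalizes; real solvers differ mostly in how they pick the stencil and the time integrator (implicit schemes trade per-step cost for unconditional stability).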

PDEs are used in computer vision. Of course they are. But you can probably skip the class itself unless you happen to become interested in one of the fields where they play a more central role. For most of computer vision my advice is to focus on these things:

  • Linear algebra
  • Numerical methods
  • Modern software engineering in C++
  • Domain specific knowledge - read Szeliski's book.

PS: Studying abroad is worth the cost, even if you have to delay graduation and pay for it with student loans.

u/[deleted] · 1 pointr/ComputerEngineering

It might not be exactly what you're looking for, but my school has a required freshman computer architecture class based around the LC-3 architecture. It's simpler than x86 or ARM, but it can run C, and in my experience it's better for an intro to architecture.

The slides are all online for free: https://wiki.illinois.edu/wiki/pages/viewpage.action?pageId=697870760

The first ~half of the slides, to my memory, aren't architecture specific, and they have lots of info on gate-level architecture, which seems like what you are looking for.

There is also a companion book to the course that wasn't required, but was pretty helpful/well written: https://www.amazon.com/Introduction-Computing-Systems-Gates-Beyond/dp/0072467509/ref=sr_1_1?keywords=patt+and+patel&qid=1566327880&s=gateway&sr=8-1

And if you want to look into LC-3 assembly coding, there are LC-3 simulators/compilers available for download online as well.

u/BirdsGetTheGirls · 1 pointr/ComputerEngineering

I'm a bit further along than you: I got a basic kit off Amazon and have slowly been adding to it, but I'm still doing some basic things because I've only recently been able to play with it more.

There are two major types of boards advertised to beginners. The first is the Arduino (and its many clones), which is very easy to program: it has outputs you can turn on and off very quickly, as well as 5V and 3.3V supply pins, and some inputs. It can't do a ton of computation, however. This is very easy to get started with.

You also have Raspberry Pis, which are about the same size but run an OS (like Linux). These also have pins that can transmit/receive, but they can't supply the kind of power from their pins that an Arduino can.

---
So your first choice is to figure out whether you want to play with components (Arduino) or do much more programming/computation-based things (Raspberry Pi).

Like you, I'm running into the brick wall of choices. I'm sure there are better ones out there than this one.
https://www.amazon.com/Smraza-Ultimate-Ultrasonic-Mega2560-Projects/dp/B01L0ZL8N6/ref=pd_day0_21_4?_encoding=UTF8&pd_rd_i=B01L0ZL8N6&pd_rd_r=YKXVW5EZ2PZZ5K5V9J85&pd_rd_w=heGtI&pd_rd_wg=xi5mX&psc=1&refRID=YKXVW5EZ2PZZ5K5V9J85

It's a nice little sampler of things. You get resistors, LEDs, a 595 shift register, some motors, things that make sound, some displays, wires, ribbons, a breadboard. Not much in the way of capacitors though.

That kit will probably let you do a lot of the basic tutorials.

What you'll find soon though, is you'll need more LEDs for something. So you buy a $4 thing of 100 LEDs. Sit down to use them, shoot, I'm out of 330R resistors. So you'll buy a $9 pack of like 1500 resistors. One day you'll go "You know, I have no idea how capacitors work. I should get some," so you spend $6 on a box of too many capacitors.

That's the stage I'm at. I want to learn how a component works, or how to use it, but the base kit I got doesn't have enough, so then I end up with a pouch full of them.

You may also want to get either a multimeter, or a super cheap oscilloscope off eBay. A small spool of wire + wirecutters makes management much easier on a breadboard because you can make wires exactly the right length.

Don't be afraid to fuck up. So what if you burn out an LED, you learned something. Made the motor smoke, did ya? Take it apart, learn how it works. Dead Arduino? See if you can't troubleshoot the problem.

u/pencan · 3 pointsr/ComputerEngineering

Try The RISC-V Reader. It's pretty good, even if the privileged section is somewhat out of date

EDIT: link to book https://www.amazon.com/RISC-V-Reader-Open-Architecture-Atlas/dp/0999249118

u/KyleRochi · 3 pointsr/ComputerEngineering

Codecademy! I recommend Python or Ruby. They are pretty easy languages to pick up, so you will have a good understanding of programming concepts when you start doing C/C++ or Java. Also, for digital logic I recommend picking up a copy of [Code](https://www.amazon.com/dp/0735611319/ref=cm_sw_r_cp_api_nxOAyb12J4N87) (Code: The Hidden Language of Computer Hardware and Software) by Charles Petzold. It is by no means a comprehensive guide, but you will be familiar with everything when you take a logic class, and while most of the class is trying to figure out what an adder is, you will already know and be focusing on how and why it works
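
As a taste of the gate-level material Petzold builds up to, here's a short Python sketch of a half adder and a full adder expressed with Boolean operators (the function names are illustrative, not from the book):

```python
def half_adder(a, b):
    # XOR gives the sum bit, AND gives the carry
    return a ^ b, a & b          # (sum, carry)

def full_adder(a, b, carry_in):
    # a full adder is two half adders plus an OR for the carries
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2           # (sum, carry_out)

# 1 + 1 + 1 = 0b11: sum bit 1, carry out 1
print(full_adder(1, 1, 1))  # (1, 1)
```

Chaining full adders bit by bit, carry-out to carry-in, gives the ripple-carry adder a logic class typically starts with.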

u/revolution801 · 1 pointr/ComputerEngineering

This was recently asked on r/mechanicalkeyboards and the suggestion was: https://www.amazon.com/Glorious-Modular-Mechanical-Gaming-Keyboard/dp/B01D8YNJH0/ref=sr_1_1?ie=UTF8&qid=1542998339&sr=8-1&keywords=gmmk

It allows you to easily switch out the key switches so you can try out multiple types if you're unsure of what you like. Also has RGB, which is nice.

u/kubiesnacks123 · 3 pointsr/ComputerEngineering

I just bought this. I’m a computer engineering major. Elegoo EL-KIT-003 UNO Project Super Starter Kit with Tutorial for Arduino https://www.amazon.com/dp/B01D8KOZF4/ref=cm_sw_r_cp_api_i_uys.AbJ98HNDX

u/zkSNARK · 3 pointsr/ComputerEngineering

If you wanna go deeper with the hardware, this is the book my university used. It contains a lifetime of knowledge. However, it is nowhere close to the readability of Code. Where I found Code to be friendly and inviting, this book is more of a grind-through-100-pages-in-2-months-and-question-your-existence type of thing. For OS stuff, we used this one. I'd say it's a lot more friendly to read than the architecture book, but really, as you go deeper into both of these subjects, they don't get friendlier.

u/soyPETE · 1 pointr/ComputerEngineering

It depends on where you want to go. You can start with this:


Effective Python: 59 Specific Ways to Write Better Python (Effective Software Development Series) https://www.amazon.com/dp/0134034287/ref=cm_sw_r_cp_api_w-O0AbM4Z2FJR

u/MushinZero · 4 pointsr/ComputerEngineering

Do you understand a current ISA? This will become clearer once you do.

MIPS or RISC-V are the recommended ones to start with.

https://smile.amazon.com/dp/0124077269/ref=cm_sw_em_r_mt_dp_U_kfG4Db0D3JR91

https://smile.amazon.com/dp/0128122757/ref=cm_sw_em_r_mt_dp_U_hfG4DbCTTF7H3

Also, it is going to be much faster to implement a processor in an HDL than in Minecraft.

u/laneLazerBeamz · 1 pointr/ComputerEngineering

We use vivado in school and they teach verilog. My impression is that VHDL is more of an industry standard, but I'm still a student so don't quote me on that. The way my university introduced digital logic was by having us start at logic gate level then use those modules to make state machines and last semester we made a MIPS processor.


Vivado (web pack should be free)
https://www.xilinx.com/products/design-tools/vivado.html

Here is the book we used for the processor
https://www.amazon.com/Computer-Organization-Design-Fifth-Architecture/dp/0124077269