Jumping back on board the digital train - part IV - diyAudio

Posted 6th May 2011 at 03:52 AM by abraxalito

Where is all this going, I hear you ask? What's his point? Well, to tell the truth, I don't hear anyone asking this - the comments sections are remarkably empty save for jkeny egging me on.

So here's my point, and it's a single word: convergence. Convergence is coming to digital architectures - indeed it's already here; the majority of people just have yet to notice.

The diverse marketplace for embedded processors is rather similar to the market for home computers in the early to mid-1980s. It was hard to make the choice - Acorn, Sinclair, Apple, Commodore, Atari, Amstrad, Dragon? Then convergence arrived in the shape of the IBM PC and those brands (with the exception of Apple) were relegated to the history books.

It took quite a while after the 1981 arrival of the PC for this to occur. I had an Acorn Atom which I built from a kit in the long summer vacation after I graduated, and I didn't make the jump into PC-land until the early 1990s. It wasn't just the expense of PCs which kept me out, it was the clunkiness of DOS and the general horribleness of the overall x86 architecture. Windows 3.1 changed that for me, along with the 486. I was converted into a user, even though not a true believer.

One of the killer apps, perhaps the killer app for me, was the Protel schematic and PCB design package, one of the first to run under Windows. I think the software cost a bit more than the PC hardware at the time (I bought an SX25, not a DX66), but boy was it worth it.

IBM succeeded with the PC, I believe, because they kept the architecture open to cloners rather than declaring it 'proprietary', protecting it and engaging legal teams to defend it. So it became the de facto standard while all the other players were 'doing their own thing'. The company in the embedded space that's the closest in spirit to IBM is the subject of my next entry. Many of you will already be able to guess who that is...
Posted in Digital

Comments

  1.
    Yes, agreed Richard, but IBM didn't intentionally keep it open source as a business model - they just underestimated its significance & weren't treating the PC as a serious business machine. It wasn't a mainframe, after all, & cost only a minute fraction of the cost of leasing a mainframe. So I believe the success of the IBM PC was because IBM took its eye off the ball. Look what they did next & what happened - they went proprietary with the PS/2 & OS/2, & failed!

    Anyway, keep it going. I believe you are correct - the embedded processor world is probably at the stage of development the PC was at back then, & it's an interesting treatise!
    Posted 6th May 2011 at 09:45 AM by jkeny
  2.
    Quote:
    Originally Posted by jkeny View Comment
    Yes, agreed Richard, but IBM didn't intentionally keep it open source as a business model - they just underestimated its significance & weren't treating the PC as a serious business machine.
    Yeah, I'm not talking here about 'open source' just open standards. So you reckon IBM kept it open by mistake - if they valued it then they'd have kept it proprietary? As a contrast, look at DEC - they're nowhere now so I guess they also took their eye off the ball...
    Posted 6th May 2011 at 12:06 PM by abraxalito
  3.
    DEC got bought out by Compaq (later absorbed into HP) and they buried them. It was a shame, as the Alpha was a nice chip, and some of DEC's technology was very nice.

    On the IBM thing, in the early days IBM was trying to sue the cloners, if I remember correctly, but it was a losing battle, which is why they went down the PS/2 route, I believe. I can remember needing to replace a floppy drive in a work PS/2. At the time a clone 1.44MB floppy drive cost around $80.00 AU. The PS/2 one was a proprietary (external) size and the clone one wouldn't fit. The genuine PS/2 floppy drive was $800.00 AU! It is no wonder PS/2s died an awful death.

    Also, I believe that OS/2 was originally a joint effort between Micro$0ft and IBM, but Micro$0ft pulled out and went off and did either Windows 95 or NT (can't remember which).

    Back on the topic of embedded microprocessors, I can remember studying the Intel i960 at uni when it first came out. That was one CPU from Intel that I actually had some respect for. I never had a chance to play with one, but I do remember it made an impression at the time.

    In the end it was Micro$0ft rather than IBM or the clones that put the PC where it is today. There were plenty of superior products available, but the problem was they were poorly marketed. The Microsoft/x86 combo had plenty not to like, but due to superior marketing (and a growing application base) it became the platform of choice.

    Tony.
    Posted 6th May 2011 at 12:45 PM by wintermute
  4.
    Quote:
    Originally Posted by abraxalito View Comment
    Yeah, I'm not talking here about 'open source' just open standards. So you reckon IBM kept it open by mistake - if they valued it then they'd have kept it proprietary? As a contrast, look at DEC - they're nowhere now so I guess they also took their eye off the ball...
    This may be completely wrong, but I believe IBM made their PC to satisfy mainframe customers who wanted to use VisiCalc, which wasn't a mainframe program. So in this sense, I think they didn't anticipate the demand - real computing happened on mainframes :).

    Once IBM was behind the PC, it changed the public's perception of these computers. But the cloners eventually took the major market share, & IBM then tried to counter this by introducing the closed-design PS/2 hardware & the OS/2 operating system - a joint development with MS. MS was able to take their code away & develop Windows. IBM continued to work on OS/2 & eventually released it (a much more robust & "real" OS), but Windows had already captured the public's imagination & stolen the market lead.

    Anyway, now we might be seeing the next technology wave as you say - embedded processors with Linux perhaps?
    Posted 6th May 2011 at 10:15 PM by jkeny
  5.
    BTW, a new manufacturing process from Intel may have significance in the future embedded space - 3D transistors, or tri-gate transistors (http://www.pcworld.com/article/227142/intel_to_bring_3d_transistors_to_nextgeneration_chips.html)
    Posted 6th May 2011 at 10:22 PM by jkeny
  6.
    Quote:
    Originally Posted by jkeny View Comment
    BTW, a new manufacturing process from Intel may have significance in future embedded space - 3D transistors
    Yeah, I read about that on AnandTech - looks interesting, but it seems to me it'll just act as a way to get to smaller effective geometries quicker. I wonder how they handle the power aspects, as those fins don't look to have good thermal paths to the substrate.
    Posted 8th May 2011 at 01:28 AM by abraxalito
  7.
    I don't find myself particularly devoted to any system in particular, but as you say, PICs are very well supported. I have used the 32 bit PICs with some success.

    I like 'C' and I like libraries that handle the setting up of the hardware registers for me. I feel inferior in that I know nothing about FPGAs and serious digital hardware.
    Posted 8th May 2011 at 12:26 PM by CopperTop
  8.
    I had a brief look at PIC32 - they're using the MIPS architecture, not something home-grown. What application(s) did you find for them?

    For you, 'serious' digital hardware is ECL or just complex stuff? I've never programmed an FPGA either.
    Posted 8th May 2011 at 02:35 PM by abraxalito
  9.
    I did some filtering of pulse trains with the PIC32, where I needed pretty high speed. I literally took pulses in on a few pins, decided whether to pass them on based on timing, and set outputs accordingly. On that occasion I programmed in assembler, but I wonder whether I gained very much over 'C'..?
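    The timing decision in that kind of pulse filtering can be sketched in plain C. The type and function names below are invented for illustration - on a real PIC32 the timestamps would come from a free-running timer and the result would set an output pin - but the accept/suppress logic itself is just this:

    ```c
    #include <stdbool.h>
    #include <stdint.h>

    typedef struct {
        uint32_t last_accepted;  /* timer count when we last passed a pulse on */
        uint32_t min_gap;        /* minimum spacing between pulses, in ticks */
    } pulse_filter;

    /* Return true if the pulse arriving at time 'now' should be passed through,
       i.e. it is at least min_gap ticks after the last accepted pulse. */
    bool pulse_filter_accept(pulse_filter *pf, uint32_t now)
    {
        /* unsigned subtraction stays correct across timer wrap-around */
        if ((uint32_t)(now - pf->last_accepted) >= pf->min_gap) {
            pf->last_accepted = now;
            return true;   /* on hardware: drive the output pin high here */
        }
        return false;      /* too soon after the last pulse: suppress it */
    }
    ```

    Whether this is written in 'C' or assembler, the fiddly part is the same: the wrap-safe unsigned comparison, which a compiler handles as well as hand-picked instructions do.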

    I didn't mean to 'diss' microcontrollers earlier, by the way. It's just that with 'C' compilers and nice development tools it can feel so easy and enjoyable to use the things, whereas being a professional should involve pain, tedium and hard work..? :-)

    I'm aware that in microcontrollers I've got a really flexible bit of 'glue', but if I need to get serious and embed some timecode on digital video, say, I will need to go through the learning curve of understanding FPGAs or PLDs and it fills me with dread. Not that I wouldn't eventually get the hang of it, but that it just doesn't look like fun!
    Posted 8th May 2011 at 05:21 PM by CopperTop
  10.
    As you seem such a big fan of C then have a look at XMOS - they look very good at doing pulse measurement at high speed (100MHz). As you were using timer peripherals to do most of the work, I guess C wasn't much of a disadvantage over assembler.

    I didn't read any 'dissing' into what you wrote, btw. I don't find writing assembler to be 'pain, tedium and hard work' - not sure why not; I just find it a load of fun to choose the optimum instructions for a particular job and have what feels like 'perfect control' over what goes on. I very much still enjoy the slick development tools.
    Posted 9th May 2011 at 12:10 AM by abraxalito
  11.
    I don't know about you, Abraxalito, but for me, the problem was never programming, as such - I started when I was young, so it's second nature. The main problem was always setting up the 'infrastructure' that was needed to get programming. There used to be so much investment of effort and understanding needed in that, that once I had got to the point of 'Hello World' I would stick with a particular processor, and maybe go through contortions to avoid having to change makefiles and such proprietary stuff. Other people seemed to make a virtue of knowing all that side of it, but then seemed to find the programming part of the job quite difficult and un-enjoyable.

    To me, if there's a competition between spending time understanding makefiles, or fading an LED in and out smoothly, I'll take the LED every time. And then when I find that a linear fade looks wrong, I'll massage the code to make it logarithmic, or whatever. The conventional engineer (in my mind) doesn't worry about the look of the LED as long as it meets the spec to the letter, but will take great delight in editing the makefile to speed up compilation by 2%.
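    The linear-versus-logarithmic point can be made concrete with a small sketch. The names and sizes here are made up for the example; the idea is that perceived brightness is roughly logarithmic in duty cycle, so a linear ramp looks jumpy at the dim end. Squaring the step number - a cheap gamma-2 approximation needing no floating point - gives a much smoother-looking fade on a small micro:

    ```c
    #include <stdint.h>

    #define FADE_STEPS 256   /* steps in one fade ramp (illustrative) */
    #define PWM_MAX    255   /* full-scale PWM duty (illustrative) */

    /* Map a linear fade step (0..FADE_STEPS-1) to a perceptually
       smoother PWM duty (0..PWM_MAX) using duty = step^2 / max. */
    uint8_t fade_duty(uint16_t step)
    {
        uint32_t s = step;
        /* square the step, then scale back into the PWM range */
        return (uint8_t)((s * s * PWM_MAX) /
                         ((FADE_STEPS - 1UL) * (FADE_STEPS - 1UL)));
    }
    ```

    Half-way through the ramp this gives about a quarter of full duty rather than half, which is roughly what the eye expects; a true logarithmic table is nicer still, but this version fits in one line of integer maths.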

    As you can tell, the slick development tools are a godsend to me - not that I use most of the snazzy functionality. The main thing is that I can persuade myself that my approach has been vindicated because I need never know anything about the makefile!
    Posted 9th May 2011 at 09:21 AM by CopperTop
  12.
    Yeah, I agree with you - the programming is the fun bit, the tedium comes from all the setting up and learning of 'infrastructure'. That's more or less what I was saying in the first post - once we learn our way around one infrastructure we tend to stick with it through thick and thin, because to go through all that again is too terrifying to contemplate. That's why I've taken so long (at least two years, on and off) choosing my next infrastructure: I'd like it to be the last, and therefore have more time available for the fun part! You sound very similar to me in that you want to get to the 'results' part by whatever means gets you there with least effort. The results part - the interactivity - where we get feedback on whether it works, is the incentive to the next burst of creativity.
    Posted 9th May 2011 at 11:05 AM by abraxalito
  13.
    Yes, that's exactly how I think about it. The setting up of the 'infrastructure' should simply be automated.

    I am happy to say that in all of my recent microcontroller work, I have been completely unaware of anything as unpleasant as a memory map. With the dsPIC family (16 bit, but fast, about a fiver) I took the chip that I had never used before, connected a few power supply pins, a 6 pin header and a resistor (no clock or crystal, or reset generator necessary!), plugged in the low cost USB programmer device, created a 'new project' and wrote a two line C program to initialise a UART and print a character to it. I think it was as instant as that. I think I then connected one of those FTDI USB modules to the UART pins and straight away I was in business with a 30 MIPS processor, full COM port style interface, and a means of powering the CPU via USB. The development interface has a drop down menu to let you select the clock source (built in RC oscillator or external xtal, built in PLL so you can choose blisteringly fast, or low current with same clock source), watchdog timer settings etc. and these are programmed into the chip automatically.
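    A hedged sketch of what that "two line" bring-up looks like: the register names (U1BRG, U1MODE, U1STA, U1TXREG) follow Microchip's dsPIC naming, but the exact configuration bits vary between family members, so the hardware writes are shown as comments rather than drop-in code. The baud-divisor arithmetic is the standard dsPIC 16x-clock formula and can be checked on its own:

    ```c
    #include <stdint.h>

    /* U1BRG value for 16x-clock mode: round(FCY / (16 * baud)) - 1,
       computed with integer rounding (add half the divisor first). */
    uint16_t uart_brg(uint32_t fcy, uint32_t baud)
    {
        return (uint16_t)((fcy + 8UL * baud) / (16UL * baud) - 1);
    }

    /* On the chip the setup really is only a couple of lines, e.g.:
     *
     *   U1BRG   = uart_brg(30000000UL, 115200UL); // 30 MIPS core clock
     *   U1MODE  = 0x8000;                         // UARTEN: enable the UART
     *   U1STA   = 0x0400;                         // UTXEN: enable transmit
     *   U1TXREG = 'A';                            // print a character
     */
    ```

    For example, a 16 MHz instruction clock at 9600 baud gives a divisor of 103, matching the worked examples in Microchip's UART documentation.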

    Once you are at that stage, you know that you're not being shielded from reality with a development board (you've literally wired the chip up yourself) so you can easily see the path to developing a real piece of hardware. But you're also in a position to explore all the peripherals: print out ADC values continuously and touch your fingers on the pins to see the hum... do some PWM... save some values to the in-built EEPROM (oh yes!) etc. And because you've got the USB interface you can do things like log stuff to the PC, or do a real time plot of the ADC values.

    I may be teaching a grandmother to suck eggs, but on the other hand if you haven't been doing this stuff for a few years you may not realise just how nice they've made it for people like us!
    Posted 9th May 2011 at 10:52 PM by CopperTop
  14.
    My blog does somewhat lag behind real-time, so yes, I have experienced some of what you've just written about. I was about to write almost the same thing in my next blog post, but now I can just direct readers to this.

    My only point of disagreement is that memory maps are not, to me, unpleasant things; they're needed to know what's in range of the offset addressing. Of course, for a C programmer the compiler will deal with this. Thanks for sharing!
    Posted 9th May 2011 at 11:16 PM by abraxalito
  15.
    I like the Microchip PICs myself; I got into them because of the low cost of entry. The first ones I used I programmed with a DIY programmer that ran off the PC serial port - it cost virtually nothing. Later I moved to the USB PICkit 2, which is cheap (cheaper if you buy a clone) and which programs a huge variety of PICs. All you need is a 6-pin header on the board if you're using surface mount, or you can make a programming board for leaded devices on a bit of veroboard.

    I'm not much influenced by the need to learn new languages; I've written in easily a dozen to date. I tend to steer away from the systems that are designed to make programming 'accessible', though; they tend to be easy when you want to do simple things but idiosyncratic when it comes to the more sophisticated. I have programmed quite a few devices in assembler - 6502, Z80, x86, 68000, PIC - and you can program the PICs in HI-TECH C, which is free too. If I wanted a bit more power I'd go to TI's TMS DSPs, PowerQUICC or an FPGA with a processor core; there are some open-source processors out there and I've already got a Xilinx download cable. There's free Xilinx software for many of their devices, although you have to pay if you want to use the biggest ones. Lattice and Altera devices are fine, but Xilinx is the mainstream.

    To pick the most efficient implementation and best organise any support and peripherals, though, you should be able to program in one of the HDLs: VHDL, Verilog or AHDL. This means in many instances you can design the board before the details of the hardware implementation have been finalised. A small CPLD can soak up all your glue logic, which can end up quite extensive, and expensive in board real estate and routing, once you add a display. Pin-compatible devices with different gate counts are available, and that, plus the routing flexibility, can get you out of a fix sometimes. Contrary to CopperTop's expectations, there are few things more fun than programming an FPGA. Programming for the web might be one of them.

    w
    Posted 13th May 2011 at 03:05 AM by wakibaki
 