Opposition versus randomness in soft computing techniques

Shahryar Rahnamayan, Hamid R. Tizhoosh, Magdy M.A. Salama

Research output: Contribution to journal › Article › peer-review


For many soft computing methods, we need to generate random numbers to use either as initial estimates or during the learning and search process. Recently, results for evolutionary algorithms, reinforcement learning, and neural networks have been reported which indicate that considering randomness and opposition simultaneously is more advantageous than pure randomness. This new scheme, called opposition-based learning, appears to accelerate soft computing algorithms. This paper proves this advantage both mathematically and experimentally and, as an application, uses it to accelerate differential evolution (DE). By taking advantage of random numbers and their opposites, the optimization, search, or learning process in many soft computing techniques can be accelerated when no a priori knowledge about the solution is available. The mathematical proofs and the results of the conducted experiments confirm each other.
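The opposition scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and the sphere objective are assumptions, but the opposite-point formula (the opposite of x in [a, b] is a + b − x) follows the standard definition of opposite numbers used in opposition-based learning.

```python
import numpy as np

def opposition_based_init(pop_size, dim, lower, upper, seed=None):
    """Generate a random population and its opposite population.

    The opposite of a point x in [lower, upper] is lower + upper - x.
    """
    rng = np.random.default_rng(seed)
    pop = lower + rng.random((pop_size, dim)) * (upper - lower)
    opp = lower + upper - pop  # opposite points, component-wise
    return pop, opp

def select_fittest(pop, opp, fitness, k):
    """Keep the k best (lowest-fitness) points from pop and opp combined."""
    combined = np.vstack([pop, opp])
    scores = np.array([fitness(x) for x in combined])
    return combined[np.argsort(scores)[:k]]

# Example with an illustrative sphere objective: the selected population
# is never worse than the purely random one, since it is drawn from a
# superset of candidates.
sphere = lambda x: float(np.sum(x ** 2))
pop, opp = opposition_based_init(pop_size=10, dim=3, lower=-5.0, upper=5.0, seed=0)
best = select_fittest(pop, opp, sphere, k=10)
```

The same idea carries over to opposition-based DE: each generation may also evaluate the opposites of (some of) the current candidates and retain the fitter of each pair.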

Original language: English (US)
Pages (from-to): 906-918
Number of pages: 13
Journal: Applied Soft Computing Journal
Issue number: 2
State: Published - Mar 2008


Keywords

  • Differential evolution
  • Opposite numbers
  • Opposition-based learning
  • Random numbers
  • Soft computing

ASJC Scopus subject areas

  • Software


