vector - Are bulk array operations faster than sequential operations?


Say we are given an array and a function called replace:

void replace(from, to, items[]) 

whose job is to replace the array elements in the range [from, to) with the elements in items.
I'll assume the maximum size of the array is known beforehand, so we can ensure the array never overflows.
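To make the cost of a single operation concrete, here is a minimal Python sketch (the names `replace`, `frm`, `to`, and `items` follow the signature above; using a Python list in place of a raw array is my own choice for illustration). The key point is that when `len(items) != to - frm`, every element after the replaced range has to shift:

```python
def replace(a, frm, to, items):
    """Replace a[frm:to] with items, shifting the tail when lengths differ.

    Cost is O(len(items) + (len(a) - to)): all elements after `to` must be
    moved whenever len(items) != to - frm.
    """
    a[frm:to] = items  # slice assignment performs the shift internally
    return a

print(replace([1, 2, 3, 4, 5], 1, 3, [9, 9, 9]))  # → [1, 9, 9, 9, 4, 5]
```

Applying k such operations one at a time therefore costs O(k·n) in the worst case, which is what the question asks whether we can beat.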

My question is: if I am given the whole list of replacements up front (e.g. as tuples of the form (from, to, items)), is it possible to obtain the final resulting array with a better time complexity than performing each operation sequentially?

In other words, is there any advantage to knowing the sequence of operations beforehand, or are we no better off than being given each operation one at a time (in terms of asymptotic time complexity)?

Note: it seems the question was confusing; I did not intend to imply that the number of elements replacing a given range is the same as the size of that range! It can be fewer or more, causing shifting, and the point of the question was to ask whether knowing the operations beforehand can avoid the shifting work in the worst case.

I think, though I may be missing something, that using a rope can shorten the time complexity. It provides concatenation of strings without shifting the actual array. (It is not restricted to strings of char; an arbitrary element type can be used as the alphabet.)

According to Wikipedia (because I have never used one myself), a rope is a balanced binary tree over fragments of a string; the rope represents the long string formed by concatenating those fragments. The replace() function can be implemented using the split() and concat() operations on the rope, both of which take O(log n) time. Here I take n to be the length of the long string the rope represents.

The rope can be converted back to a normal array in O(n) time using the report() operation.

So the answer is: there is an algorithm faster than sequentially applying the string operations.

  • Convert the given array to a rope.
  • Apply every replace job on the rope. Each operation runs in O(log n) and does not copy the actual array items.
  • Convert the processed rope back to a real array (this is where the items get copied).
  • Return it.
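The steps above can be sketched in Python. This is a deliberately minimal rope: leaves hold list fragments, internal nodes concatenate two subropes, and no rebalancing is performed, so the O(log n) bound only holds for a balanced variant such as the one Wikipedia describes. All names here (`Rope`, `concat`, `split`, `replace`, `report`) are my own, chosen to mirror the operations named in the answer:

```python
class Rope:
    """Either a leaf holding a list fragment, or an internal concat node.
    No rebalancing, so this only illustrates the structure of the algorithm."""
    def __init__(self, leaf=None, left=None, right=None):
        self.leaf = leaf
        self.left = left
        self.right = right
        self.length = len(leaf) if leaf is not None else left.length + right.length

def concat(a, b):
    if a is None: return b
    if b is None: return a
    return Rope(left=a, right=b)

def split(r, i):
    """Split rope r into (first i elements, the rest)."""
    if r is None:
        return None, None
    if r.leaf is not None:
        return (Rope(leaf=r.leaf[:i]) if i > 0 else None,
                Rope(leaf=r.leaf[i:]) if i < len(r.leaf) else None)
    if i < r.left.length:
        l, mid = split(r.left, i)
        return l, concat(mid, r.right)
    l, rgt = split(r.right, i - r.left.length)
    return concat(r.left, l), rgt

def replace(r, frm, to, items):
    """replace(from, to, items) via two splits and two concats; no shifting."""
    left, rest = split(r, frm)
    _, right = split(rest, to - frm)
    return concat(concat(left, Rope(leaf=list(items))), right)

def report(r, out=None):
    """Flatten the rope back into a plain list in O(n)."""
    if out is None: out = []
    if r is None: return out
    if r.leaf is not None:
        out.extend(r.leaf)
    else:
        report(r.left, out); report(r.right, out)
    return out

# The batch pipeline: build the rope once, apply every job, flatten once.
rope = Rope(leaf=[1, 2, 3, 4, 5])
for frm, to, items in [(1, 3, [9, 9, 9]), (0, 1, [])]:
    rope = replace(rope, frm, to, items)
print(report(rope))  # → [9, 9, 9, 4, 5]
```

With a balanced rope, k replace jobs on an array of final length n would cost O(n + k log n) in total, versus O(k·n) worst case for the shifting version.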
